Dec 03 19:52:35.473455 master-0 systemd[1]: Starting Kubernetes Kubelet...
Dec 03 19:52:35.699557 master-0 kubenswrapper[4813]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 03 19:52:35.699557 master-0 kubenswrapper[4813]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Dec 03 19:52:35.699557 master-0 kubenswrapper[4813]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 03 19:52:35.699557 master-0 kubenswrapper[4813]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 03 19:52:35.699557 master-0 kubenswrapper[4813]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Dec 03 19:52:35.699557 master-0 kubenswrapper[4813]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
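The deprecation notices above all point at the same remedy: move these settings into the KubeletConfiguration file named by `--config` (here `/etc/kubernetes/kubelet.conf`, per the flag dump further down). A minimal, illustrative sketch of the equivalent config-file stanzas — the field names come from the `kubelet.config.k8s.io/v1beta1` API, and the values simply mirror the flags logged in this excerpt; this is not the node's actual config file:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# Replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# Replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# Replaces --register-with-taints
registerWithTaints:
  - key: node-role.kubernetes.io/master
    effect: NoSchedule
# Replaces --system-reserved
systemReserved:
  cpu: 500m
  memory: 1Gi
  ephemeral-storage: 1Gi
```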
Dec 03 19:52:35.700968 master-0 kubenswrapper[4813]: I1203 19:52:35.700016 4813 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 03 19:52:35.703415 master-0 kubenswrapper[4813]: W1203 19:52:35.703368 4813 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 19:52:35.703415 master-0 kubenswrapper[4813]: W1203 19:52:35.703385 4813 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 19:52:35.703415 master-0 kubenswrapper[4813]: W1203 19:52:35.703390 4813 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 19:52:35.703415 master-0 kubenswrapper[4813]: W1203 19:52:35.703394 4813 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 19:52:35.703415 master-0 kubenswrapper[4813]: W1203 19:52:35.703398 4813 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 19:52:35.703415 master-0 kubenswrapper[4813]: W1203 19:52:35.703402 4813 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 19:52:35.703415 master-0 kubenswrapper[4813]: W1203 19:52:35.703405 4813 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 19:52:35.703415 master-0 kubenswrapper[4813]: W1203 19:52:35.703409 4813 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 19:52:35.703415 master-0 kubenswrapper[4813]: W1203 19:52:35.703413 4813 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 19:52:35.703415 master-0 kubenswrapper[4813]: W1203 19:52:35.703417 4813 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 19:52:35.703415 master-0 kubenswrapper[4813]: W1203 19:52:35.703420 4813 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 19:52:35.703415 master-0 kubenswrapper[4813]: W1203 19:52:35.703424 4813 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 19:52:35.703415 master-0 kubenswrapper[4813]: W1203 19:52:35.703428 4813 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 19:52:35.703415 master-0 kubenswrapper[4813]: W1203 19:52:35.703433 4813 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 19:52:35.703415 master-0 kubenswrapper[4813]: W1203 19:52:35.703438 4813 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 19:52:35.704160 master-0 kubenswrapper[4813]: W1203 19:52:35.703452 4813 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 19:52:35.704160 master-0 kubenswrapper[4813]: W1203 19:52:35.703456 4813 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 19:52:35.704160 master-0 kubenswrapper[4813]: W1203 19:52:35.703460 4813 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 19:52:35.704160 master-0 kubenswrapper[4813]: W1203 19:52:35.703465 4813 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 19:52:35.704160 master-0 kubenswrapper[4813]: W1203 19:52:35.703471 4813 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 19:52:35.704160 master-0 kubenswrapper[4813]: W1203 19:52:35.703475 4813 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 19:52:35.704160 master-0 kubenswrapper[4813]: W1203 19:52:35.703478 4813 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 19:52:35.704160 master-0 kubenswrapper[4813]: W1203 19:52:35.703483 4813 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 19:52:35.704160 master-0 kubenswrapper[4813]: W1203 19:52:35.703488 4813 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 19:52:35.704160 master-0 kubenswrapper[4813]: W1203 19:52:35.703492 4813 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 19:52:35.704160 master-0 kubenswrapper[4813]: W1203 19:52:35.703496 4813 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 19:52:35.704160 master-0 kubenswrapper[4813]: W1203 19:52:35.703500 4813 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 19:52:35.704160 master-0 kubenswrapper[4813]: W1203 19:52:35.703504 4813 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 19:52:35.704160 master-0 kubenswrapper[4813]: W1203 19:52:35.703509 4813 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 19:52:35.704160 master-0 kubenswrapper[4813]: W1203 19:52:35.703513 4813 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 19:52:35.704160 master-0 kubenswrapper[4813]: W1203 19:52:35.703517 4813 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Dec 03 19:52:35.704160 master-0 kubenswrapper[4813]: W1203 19:52:35.703522 4813 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 19:52:35.704160 master-0 kubenswrapper[4813]: W1203 19:52:35.703526 4813 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703530 4813 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703534 4813 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703537 4813 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703541 4813 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703544 4813 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703548 4813 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703552 4813 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703555 4813 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703559 4813 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703562 4813 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703566 4813 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703571 4813 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703574 4813 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703577 4813 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703582 4813 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703586 4813 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703589 4813 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703593 4813 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703597 4813 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 19:52:35.704984 master-0 kubenswrapper[4813]: W1203 19:52:35.703600 4813 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703604 4813 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703607 4813 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703611 4813 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703614 4813 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703618 4813 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703622 4813 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703626 4813 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703629 4813 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703634 4813 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703638 4813 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703642 4813 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703646 4813 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703649 4813 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703653 4813 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703656 4813 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703659 4813 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703663 4813 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703667 4813 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: W1203 19:52:35.703670 4813 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 19:52:35.705934 master-0 kubenswrapper[4813]: I1203 19:52:35.703742 4813 flags.go:64] FLAG: --address="0.0.0.0"
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.703751 4813 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.703758 4813 flags.go:64] FLAG: --anonymous-auth="true"
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.703763 4813 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.703768 4813 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.703784 4813 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.703790 4813 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.703795 4813 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.703800 4813 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.703805 4813 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.703809 4813 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.703813 4813 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.703817 4813 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.703821 4813 flags.go:64] FLAG: --cgroup-root=""
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.703825 4813 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.703830 4813 flags.go:64] FLAG: --client-ca-file=""
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.703834 4813 flags.go:64] FLAG: --cloud-config=""
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.703838 4813 flags.go:64] FLAG: --cloud-provider=""
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.703842 4813 flags.go:64] FLAG: --cluster-dns="[]"
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.704037 4813 flags.go:64] FLAG: --cluster-domain=""
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.704042 4813 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.704046 4813 flags.go:64] FLAG: --config-dir=""
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.704051 4813 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.704055 4813 flags.go:64] FLAG: --container-log-max-files="5"
Dec 03 19:52:35.706768 master-0 kubenswrapper[4813]: I1203 19:52:35.704061 4813 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704066 4813 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704070 4813 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704075 4813 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704079 4813 flags.go:64] FLAG: --contention-profiling="false"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704084 4813 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704088 4813 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704092 4813 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704096 4813 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704101 4813 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704106 4813 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704110 4813 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704114 4813 flags.go:64] FLAG: --enable-load-reader="false"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704119 4813 flags.go:64] FLAG: --enable-server="true"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704123 4813 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704129 4813 flags.go:64] FLAG: --event-burst="100"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704134 4813 flags.go:64] FLAG: --event-qps="50"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704139 4813 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704143 4813 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704148 4813 flags.go:64] FLAG: --eviction-hard=""
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704153 4813 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704157 4813 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704162 4813 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704166 4813 flags.go:64] FLAG: --eviction-soft=""
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704171 4813 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 03 19:52:35.707923 master-0 kubenswrapper[4813]: I1203 19:52:35.704176 4813 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704180 4813 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704185 4813 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704195 4813 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704199 4813 flags.go:64] FLAG: --fail-swap-on="true"
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704203 4813 flags.go:64] FLAG: --feature-gates=""
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704208 4813 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704213 4813 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704217 4813 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704221 4813 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704226 4813 flags.go:64] FLAG: --healthz-port="10248"
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704230 4813 flags.go:64] FLAG: --help="false"
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704234 4813 flags.go:64] FLAG: --hostname-override=""
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704238 4813 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704242 4813 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704246 4813 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704250 4813 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704254 4813 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704258 4813 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704262 4813 flags.go:64] FLAG: --image-service-endpoint=""
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704266 4813 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704270 4813 flags.go:64] FLAG: --kube-api-burst="100"
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704274 4813 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704278 4813 flags.go:64] FLAG: --kube-api-qps="50"
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704282 4813 flags.go:64] FLAG: --kube-reserved=""
Dec 03 19:52:35.709091 master-0 kubenswrapper[4813]: I1203 19:52:35.704286 4813 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704290 4813 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704294 4813 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704298 4813 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704302 4813 flags.go:64] FLAG: --lock-file=""
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704306 4813 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704310 4813 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704315 4813 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704321 4813 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704325 4813 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704330 4813 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704334 4813 flags.go:64] FLAG: --logging-format="text"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704338 4813 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704343 4813 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704347 4813 flags.go:64] FLAG: --manifest-url=""
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704351 4813 flags.go:64] FLAG: --manifest-url-header=""
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704356 4813 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704360 4813 flags.go:64] FLAG: --max-open-files="1000000"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704366 4813 flags.go:64] FLAG: --max-pods="110"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704370 4813 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704373 4813 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704377 4813 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704381 4813 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704386 4813 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 03 19:52:35.710340 master-0 kubenswrapper[4813]: I1203 19:52:35.704391 4813 flags.go:64] FLAG: --node-ip="192.168.32.10"
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704395 4813 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704404 4813 flags.go:64] FLAG: --node-status-max-images="50"
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704409 4813 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704413 4813 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704417 4813 flags.go:64] FLAG: --pod-cidr=""
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704421 4813 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fff930cf757e23d388d86d05942b76e44d3bda5e387b299c239e4d12545d26dd"
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704427 4813 flags.go:64] FLAG: --pod-manifest-path=""
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704431 4813 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704435 4813 flags.go:64] FLAG: --pods-per-core="0"
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704439 4813 flags.go:64] FLAG: --port="10250"
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704443 4813 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704447 4813 flags.go:64] FLAG: --provider-id=""
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704451 4813 flags.go:64] FLAG: --qos-reserved=""
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704455 4813 flags.go:64] FLAG: --read-only-port="10255"
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704459 4813 flags.go:64] FLAG: --register-node="true"
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704463 4813 flags.go:64] FLAG: --register-schedulable="true"
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704467 4813 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704474 4813 flags.go:64] FLAG: --registry-burst="10"
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704480 4813 flags.go:64] FLAG: --registry-qps="5"
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704485 4813 flags.go:64] FLAG: --reserved-cpus=""
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704489 4813 flags.go:64] FLAG: --reserved-memory=""
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704496 4813 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704500 4813 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 03 19:52:35.711408 master-0 kubenswrapper[4813]: I1203 19:52:35.704504 4813 flags.go:64] FLAG: --rotate-certificates="false"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704509 4813 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704513 4813 flags.go:64] FLAG: --runonce="false"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704517 4813 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704521 4813 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704525 4813 flags.go:64] FLAG: --seccomp-default="false"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704530 4813 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704534 4813 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704538 4813 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704542 4813 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704546 4813 flags.go:64] FLAG: --storage-driver-password="root"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704550 4813 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704554 4813 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704558 4813 flags.go:64] FLAG: --storage-driver-user="root"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704562 4813 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704566 4813 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704570 4813 flags.go:64] FLAG: --system-cgroups=""
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704574 4813 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704581 4813 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704585 4813 flags.go:64] FLAG: --tls-cert-file=""
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704588 4813 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704594 4813 flags.go:64] FLAG: --tls-min-version=""
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704597 4813 flags.go:64] FLAG: --tls-private-key-file=""
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704602 4813 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704606 4813 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 03 19:52:35.712644 master-0 kubenswrapper[4813]: I1203 19:52:35.704610 4813 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: I1203 19:52:35.704614 4813 flags.go:64] FLAG: --v="2"
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: I1203 19:52:35.704620 4813 flags.go:64] FLAG: --version="false"
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: I1203 19:52:35.704625 4813 flags.go:64] FLAG: --vmodule=""
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: I1203 19:52:35.704630 4813 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: I1203 19:52:35.704634 4813 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: W1203 19:52:35.704725 4813 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: W1203 19:52:35.704729 4813 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: W1203 19:52:35.704733 4813 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: W1203 19:52:35.704737 4813 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: W1203 19:52:35.704740 4813 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: W1203 19:52:35.704744 4813 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: W1203 19:52:35.704749 4813 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: W1203 19:52:35.704754 4813 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: W1203 19:52:35.704758 4813 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: W1203 19:52:35.704762 4813 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: W1203 19:52:35.704767 4813 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: W1203 19:52:35.704794 4813 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: W1203 19:52:35.704799 4813 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: W1203 19:52:35.704803 4813 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: W1203 19:52:35.704807 4813 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 19:52:35.714214 master-0 kubenswrapper[4813]: W1203 19:52:35.704811 4813 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704814 4813 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704818 4813 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704822 4813 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704825 4813 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704829 4813 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704832 4813 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704836 4813 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704839 4813 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704843 4813 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704847 4813 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704851 4813 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704856 4813 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704862 4813 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704866 4813 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704870 4813 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704873 4813 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704878 4813 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704881 4813 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 19:52:35.715537 master-0 kubenswrapper[4813]: W1203 19:52:35.704884 4813 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704888 4813 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704891 4813 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704895 4813 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704898 4813 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704902 4813 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704905 4813 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704909 4813 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704913 4813 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704917 4813 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704920 4813 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704923 4813 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704938 4813 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704942 4813 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704946 4813 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704950 4813 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704955 4813 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704959 4813 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704963 4813 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704968 4813 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 19:52:35.716942 master-0 kubenswrapper[4813]: W1203 19:52:35.704972 4813 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 19:52:35.718016 master-0 kubenswrapper[4813]: W1203 19:52:35.704977 4813 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 19:52:35.718016 master-0 kubenswrapper[4813]: W1203 19:52:35.704982 4813 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 19:52:35.718016 master-0 kubenswrapper[4813]: W1203 19:52:35.704987 4813 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 19:52:35.718016 master-0 kubenswrapper[4813]: W1203 19:52:35.704992 4813 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 19:52:35.718016 master-0 kubenswrapper[4813]: W1203 19:52:35.704996 4813 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 19:52:35.718016 master-0 kubenswrapper[4813]: W1203 19:52:35.705002 4813 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 19:52:35.718016 master-0 kubenswrapper[4813]: W1203 19:52:35.705006 4813 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 19:52:35.718016 master-0 kubenswrapper[4813]: W1203 19:52:35.705009 4813 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 19:52:35.718016 master-0 kubenswrapper[4813]: W1203 19:52:35.705014 4813 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 19:52:35.718016 master-0 kubenswrapper[4813]: W1203 19:52:35.705017 4813 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 19:52:35.718016 master-0 kubenswrapper[4813]: W1203 19:52:35.705021 4813 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 19:52:35.718016 master-0 kubenswrapper[4813]: W1203 19:52:35.705025 4813 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 19:52:35.718016 master-0 kubenswrapper[4813]: W1203 19:52:35.705028 4813 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 19:52:35.718016 master-0 kubenswrapper[4813]: W1203 19:52:35.705032 4813 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 19:52:35.718016 master-0 kubenswrapper[4813]: W1203 19:52:35.705036 4813 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 19:52:35.718016 master-0 kubenswrapper[4813]: W1203 19:52:35.705039 4813 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 19:52:35.718016 master-0 kubenswrapper[4813]: W1203 19:52:35.705043 4813 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 19:52:35.718947 master-0 kubenswrapper[4813]: I1203 19:52:35.705053 4813 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 19:52:35.718947 master-0 kubenswrapper[4813]: I1203 19:52:35.717477 4813 server.go:491] "Kubelet version" kubeletVersion="v1.31.13"
Dec 03 19:52:35.718947 master-0 kubenswrapper[4813]: I1203 19:52:35.717561 4813 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 03 19:52:35.718947 master-0 kubenswrapper[4813]: W1203 19:52:35.717702 4813 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 19:52:35.718947 master-0 kubenswrapper[4813]: W1203 19:52:35.717714 4813 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 19:52:35.718947 master-0 kubenswrapper[4813]: W1203 19:52:35.717725 4813 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 19:52:35.718947 master-0 kubenswrapper[4813]: W1203 19:52:35.717734 4813 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 19:52:35.718947 master-0 kubenswrapper[4813]: W1203 19:52:35.717744 4813 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 19:52:35.718947 master-0 kubenswrapper[4813]: W1203 19:52:35.717752 4813 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 19:52:35.718947 master-0 kubenswrapper[4813]: W1203 19:52:35.717760 4813 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 19:52:35.718947 master-0 kubenswrapper[4813]: W1203 19:52:35.717768 4813 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 19:52:35.718947 master-0 kubenswrapper[4813]: W1203 19:52:35.717804 4813 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 19:52:35.718947 master-0 kubenswrapper[4813]: W1203 19:52:35.717815 4813 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 19:52:35.718947 master-0 kubenswrapper[4813]: W1203 19:52:35.717829 4813 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717838 4813 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717847 4813 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717855 4813 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717864 4813 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717872 4813 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717880 4813 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717887 4813 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717895 4813 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717903 4813 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717911 4813 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717918 4813 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717926 4813 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717933 4813 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717941 4813 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717950 4813 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717958 4813 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717968 4813 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717978 4813 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 19:52:35.719708 master-0 kubenswrapper[4813]: W1203 19:52:35.717988 4813 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.717998 4813 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718008 4813 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718018 4813 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718028 4813 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718036 4813 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718045 4813 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718054 4813 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718063 4813 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718070 4813 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718078 4813 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718086 4813 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718096 4813 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718106 4813 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718115 4813 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718123 4813 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718132 4813 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718140 4813 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718149 4813 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718157 4813 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 19:52:35.720578 master-0 kubenswrapper[4813]: W1203 19:52:35.718165 4813 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718174 4813 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718183 4813 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718191 4813 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718199 4813 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718207 4813 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718215 4813 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718223 4813 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718231 4813 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718239 4813 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718249 4813 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718257 4813 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718265 4813 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718272 4813 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718283 4813 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718292 4813 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718299 4813 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718308 4813 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718318 4813 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718328 4813 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 19:52:35.721856 master-0 kubenswrapper[4813]: W1203 19:52:35.718337 4813 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 19:52:35.723350 master-0 kubenswrapper[4813]: W1203 19:52:35.718345 4813 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 19:52:35.723350 master-0 kubenswrapper[4813]: W1203 19:52:35.718353 4813 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 19:52:35.723350 master-0 kubenswrapper[4813]: I1203 19:52:35.718365 4813 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 19:52:35.723350 master-0 kubenswrapper[4813]: W1203 19:52:35.718603 4813 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 19:52:35.723350 master-0 kubenswrapper[4813]: W1203 19:52:35.718618 4813 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 19:52:35.723350 master-0 kubenswrapper[4813]: W1203 19:52:35.718626 4813 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 19:52:35.723350 master-0 kubenswrapper[4813]: W1203 19:52:35.718635 4813 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 19:52:35.723350 master-0 kubenswrapper[4813]: W1203 19:52:35.718643 4813 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 19:52:35.723350 master-0 kubenswrapper[4813]: W1203 19:52:35.718652 4813 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 19:52:35.723350 master-0 kubenswrapper[4813]: W1203 19:52:35.718659 4813 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 19:52:35.723350 master-0 kubenswrapper[4813]: W1203 19:52:35.718667 4813 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 19:52:35.723350 master-0 kubenswrapper[4813]: W1203 19:52:35.718675 4813 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 19:52:35.723350 master-0 kubenswrapper[4813]: W1203 19:52:35.718683 4813 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 19:52:35.723350 master-0 kubenswrapper[4813]: W1203 19:52:35.718691 4813 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 19:52:35.723350 master-0 kubenswrapper[4813]: W1203 19:52:35.718699 4813 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718707 4813 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718714 4813 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718722 4813 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718730 4813 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718737 4813 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718746 4813 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718754 4813 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718763 4813 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718773 4813 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718806 4813 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718814 4813 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718823 4813 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718831 4813 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718838 4813 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718846 4813 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718854 4813 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718862 4813 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718870 4813 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 19:52:35.724184 master-0 kubenswrapper[4813]: W1203 19:52:35.718877 4813 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.718885 4813 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.718892 4813 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.718901 4813 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.718911 4813 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.718920 4813 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.718929 4813 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.718937 4813 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.718948 4813 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.718958 4813 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.718966 4813 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.718975 4813 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.718984 4813 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.718992 4813 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.719002 4813 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.719010 4813 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.719018 4813 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.719025 4813 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.719033 4813 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 19:52:35.725319 master-0 kubenswrapper[4813]: W1203 19:52:35.719040 4813 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719049 4813 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719057 4813 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719065 4813 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719073 4813 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719081 4813 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719089 4813 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719097 4813 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719104 4813 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719112 4813 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719120 4813 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719128 4813 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719136 4813 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719143 4813 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719153 4813 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719163 4813 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719172 4813 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719180 4813 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719188 4813 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719198 4813 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 19:52:35.726242 master-0 kubenswrapper[4813]: W1203 19:52:35.719211 4813 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 19:52:35.727153 master-0 kubenswrapper[4813]: W1203 19:52:35.719223 4813 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 19:52:35.727153 master-0 kubenswrapper[4813]: W1203 19:52:35.719234 4813 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 19:52:35.727153 master-0 kubenswrapper[4813]: I1203 19:52:35.719251 4813 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 19:52:35.727153 master-0 kubenswrapper[4813]: I1203 19:52:35.719570 4813 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 03 19:52:35.727153 master-0 kubenswrapper[4813]: I1203 19:52:35.722756 4813 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Dec 03 19:52:35.727153 master-0 kubenswrapper[4813]: I1203 19:52:35.723699 4813 server.go:997] "Starting client certificate rotation"
Dec 03 19:52:35.727153 master-0 kubenswrapper[4813]: I1203 19:52:35.723733 4813 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 03 19:52:35.727153 master-0 kubenswrapper[4813]: I1203 19:52:35.724088 4813 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 03 19:52:35.732697 master-0 kubenswrapper[4813]: I1203 19:52:35.732635 4813 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 19:52:35.735312 master-0 kubenswrapper[4813]: I1203 19:52:35.735238 4813 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 19:52:35.736068 master-0 kubenswrapper[4813]: E1203 19:52:35.735952 4813 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Dec 03 19:52:35.748286 master-0 kubenswrapper[4813]: I1203 19:52:35.748238 4813 log.go:25] "Validated CRI v1 runtime API"
Dec 03 19:52:35.751955 master-0 kubenswrapper[4813]: I1203 19:52:35.751913 4813 log.go:25] "Validated CRI v1 image API"
Dec 03 19:52:35.754387 master-0 kubenswrapper[4813]: I1203 19:52:35.754344 4813 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 03 19:52:35.757136 master-0 kubenswrapper[4813]: I1203 19:52:35.757083 4813 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4 a110c2ad-b51b-427d-8eb4-4344f49e01ee:/dev/vda3]
Dec 03 19:52:35.757208 master-0 kubenswrapper[4813]: I1203 19:52:35.757128 4813 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}]
Dec 03 19:52:35.790473 master-0 kubenswrapper[4813]: I1203 19:52:35.789995 4813 manager.go:217] Machine: {Timestamp:2025-12-03 19:52:35.787679486 +0000 UTC m=+0.216477975 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2799998 MemoryCapacity:50514153472 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:9870f3c6b33d40089e247d1fa3d9248c SystemUUID:9870f3c6-b33d-4008-9e24-7d1fa3d9248c BootID:2118df0c-6317-4582-908c-71a63e50558d Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257074688 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257078784 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:c1:91:ba Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:45:dc:6d Speed:-1 Mtu:9000} {Name:ovs-system MacAddress:1a:18:db:b8:db:2e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514153472 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 03 19:52:35.790473 master-0 kubenswrapper[4813]: I1203 19:52:35.790399 4813 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 03 19:52:35.790808 master-0 kubenswrapper[4813]: I1203 19:52:35.790663 4813 manager.go:233] Version: {KernelVersion:5.14.0-427.97.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202511041748-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 03 19:52:35.791827 master-0 kubenswrapper[4813]: I1203 19:52:35.791726 4813 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 03 19:52:35.792172 master-0 kubenswrapper[4813]: I1203 19:52:35.792106 4813 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 03 19:52:35.792512 master-0 kubenswrapper[4813]: I1203 19:52:35.792162 4813 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 03 19:52:35.792597 master-0 kubenswrapper[4813]: I1203 19:52:35.792539 4813 topology_manager.go:138] "Creating topology manager with none policy"
Dec 03 19:52:35.792597 master-0 kubenswrapper[4813]: I1203 19:52:35.792557 4813 container_manager_linux.go:303] "Creating device plugin manager"
Dec 03 19:52:35.792990 master-0 kubenswrapper[4813]: I1203 19:52:35.792948 4813 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 19:52:35.793055 master-0 kubenswrapper[4813]: I1203 19:52:35.793018 4813 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 19:52:35.793547 master-0 kubenswrapper[4813]: I1203 19:52:35.793503 4813 state_mem.go:36] "Initialized new in-memory state store"
Dec 03 19:52:35.793694 master-0 kubenswrapper[4813]: I1203 19:52:35.793656 4813 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 03 19:52:35.795508 master-0 kubenswrapper[4813]: I1203 19:52:35.795456 4813 kubelet.go:418] "Attempting to sync node with API server"
Dec 03 19:52:35.795508 master-0 kubenswrapper[4813]: I1203 19:52:35.795494 4813 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 03 19:52:35.795706 master-0 kubenswrapper[4813]: I1203 19:52:35.795536 4813 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 03 19:52:35.795706 master-0 kubenswrapper[4813]: I1203 19:52:35.795559 4813 kubelet.go:324] "Adding apiserver pod source"
Dec 03 19:52:35.795706 master-0 kubenswrapper[4813]: I1203 19:52:35.795584 4813 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 03 19:52:35.798096 master-0 kubenswrapper[4813]: I1203 19:52:35.798033 4813 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-2.rhaos4.18.git15789b8.el9" apiVersion="v1"
Dec 03 19:52:35.799133 master-0 kubenswrapper[4813]: I1203 19:52:35.799087 4813 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 03 19:52:35.799329 master-0 kubenswrapper[4813]: W1203 19:52:35.799235 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Dec 03 19:52:35.799329 master-0 kubenswrapper[4813]: I1203 19:52:35.799305 4813 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 03 19:52:35.799329 master-0 kubenswrapper[4813]: I1203 19:52:35.799325 4813 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 03 19:52:35.799329 master-0 kubenswrapper[4813]: I1203 19:52:35.799333 4813 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 03 19:52:35.799329 master-0 kubenswrapper[4813]: I1203 19:52:35.799340 4813 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 03 19:52:35.799729 master-0 kubenswrapper[4813]: I1203 19:52:35.799347 4813 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 03 19:52:35.799729 master-0 kubenswrapper[4813]: I1203 19:52:35.799355 4813 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 03 19:52:35.799729 master-0 kubenswrapper[4813]: W1203 19:52:35.799223 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Dec 03 19:52:35.799729 master-0 kubenswrapper[4813]: E1203 19:52:35.799387 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Dec 03 19:52:35.799729 master-0 kubenswrapper[4813]: E1203 19:52:35.799343 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Dec 03 19:52:35.799729 master-0 kubenswrapper[4813]: I1203 19:52:35.799362 4813 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 03 19:52:35.799729 master-0 kubenswrapper[4813]: I1203 19:52:35.799466 4813 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 03 19:52:35.799729 master-0 kubenswrapper[4813]: I1203 19:52:35.799486 4813 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 03 19:52:35.799729 master-0 kubenswrapper[4813]: I1203 19:52:35.799503 4813 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 03 19:52:35.799729 master-0 kubenswrapper[4813]: I1203 19:52:35.799522 4813 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 03 19:52:35.799729 master-0 kubenswrapper[4813]: I1203 19:52:35.799548 4813 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 03 19:52:35.800295 master-0 kubenswrapper[4813]: I1203 19:52:35.799880 4813 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 03 19:52:35.800609 master-0 kubenswrapper[4813]: I1203 19:52:35.800566 4813 server.go:1280] "Started kubelet"
Dec 03 19:52:35.801402 master-0 kubenswrapper[4813]: I1203 19:52:35.801173 4813 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 03 19:52:35.801700 master-0 kubenswrapper[4813]: I1203 19:52:35.801276 4813 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 03 19:52:35.801700 master-0 kubenswrapper[4813]: I1203 19:52:35.801525 4813 server_v1.go:47] "podresources" method="list" useActivePods=true
Dec 03 19:52:35.801959 master-0 kubenswrapper[4813]: I1203 19:52:35.801877 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Dec 03 19:52:35.802371 master-0 systemd[1]: Started Kubernetes Kubelet.
Dec 03 19:52:35.803169 master-0 kubenswrapper[4813]: I1203 19:52:35.802843 4813 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 03 19:52:35.803971 master-0 kubenswrapper[4813]: E1203 19:52:35.803366 4813 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.187dcc91da19634b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.800523595 +0000 UTC m=+0.229322084,LastTimestamp:2025-12-03 19:52:35.800523595 +0000 UTC m=+0.229322084,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Dec 03 19:52:35.804088 master-0 kubenswrapper[4813]: I1203 19:52:35.804010 4813 server.go:449] "Adding debug handlers to kubelet server"
Dec 03 19:52:35.805146 master-0 kubenswrapper[4813]: I1203 19:52:35.805084 4813 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 03 19:52:35.805146 master-0 kubenswrapper[4813]: I1203 19:52:35.805139 4813 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 03 19:52:35.805878 master-0 kubenswrapper[4813]: I1203 19:52:35.805649 4813 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 03 19:52:35.805878 master-0 kubenswrapper[4813]: I1203 19:52:35.805733 4813 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 03 19:52:35.806994 master-0 kubenswrapper[4813]: I1203 19:52:35.806933 4813 reconstruct.go:97] "Volume reconstruction finished"
Dec 03 19:52:35.806994 master-0 kubenswrapper[4813]: I1203 19:52:35.806976 4813 reconciler.go:26] "Reconciler: start to sync state"
Dec 03 19:52:35.807167 master-0 kubenswrapper[4813]: I1203 19:52:35.806969 4813 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Dec 03 19:52:35.808217 master-0 kubenswrapper[4813]: W1203 19:52:35.808102 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Dec 03 19:52:35.808363 master-0 kubenswrapper[4813]: E1203 19:52:35.808224 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Dec 03 19:52:35.808595 master-0 kubenswrapper[4813]: E1203 19:52:35.806478 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Dec 03 19:52:35.808695 master-0 kubenswrapper[4813]: E1203 19:52:35.808600 4813 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms"
Dec 03 19:52:35.810280 master-0 kubenswrapper[4813]: I1203 19:52:35.810219 4813 factory.go:55] Registering systemd factory
Dec 03 19:52:35.810280 master-0 kubenswrapper[4813]: I1203 19:52:35.810271 4813 factory.go:221] Registration of the systemd container factory successfully
Dec 03 19:52:35.811540 master-0 kubenswrapper[4813]: I1203 19:52:35.811479 4813 factory.go:153] Registering CRI-O factory
Dec 03 19:52:35.811540 master-0 kubenswrapper[4813]: I1203 19:52:35.811516 4813 factory.go:221] Registration of the crio container factory successfully
Dec 03 19:52:35.811742 master-0 kubenswrapper[4813]: I1203 19:52:35.811638 4813 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 03 19:52:35.811742 master-0 kubenswrapper[4813]: I1203 19:52:35.811684 4813 factory.go:103] Registering Raw factory
Dec 03 19:52:35.811742 master-0 kubenswrapper[4813]: I1203 19:52:35.811720 4813 manager.go:1196] Started watching for new ooms in manager
Dec 03 19:52:35.812910 master-0 kubenswrapper[4813]: I1203 19:52:35.812766 4813 manager.go:319] Starting recovery of all containers
Dec 03 19:52:35.816198 master-0 kubenswrapper[4813]: E1203 19:52:35.816132 4813 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Dec 03 19:52:35.842240 master-0 kubenswrapper[4813]: I1203 19:52:35.842177 4813 manager.go:324] Recovery completed
Dec 03 19:52:35.862118 master-0 kubenswrapper[4813]: I1203 19:52:35.862025 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:35.864188 master-0 kubenswrapper[4813]: I1203 19:52:35.864119 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:35.864294 master-0 kubenswrapper[4813]: I1203 19:52:35.864195 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:35.864294 master-0 kubenswrapper[4813]: I1203 19:52:35.864219 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:35.865381 master-0 kubenswrapper[4813]: I1203 19:52:35.865292 4813 cpu_manager.go:225] "Starting CPU manager" policy="none"
Dec 03 19:52:35.865381 master-0 kubenswrapper[4813]: I1203 19:52:35.865322 4813 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Dec 03 19:52:35.865587 master-0 kubenswrapper[4813]: I1203 19:52:35.865390 4813 state_mem.go:36] "Initialized new in-memory state store"
Dec 03 19:52:35.869123 master-0 kubenswrapper[4813]: I1203 19:52:35.869065 4813 policy_none.go:49] "None policy: Start"
Dec 03 19:52:35.870354 master-0 kubenswrapper[4813]: I1203 19:52:35.870294 4813 memory_manager.go:170] "Starting memorymanager" policy="None"
Dec 03 19:52:35.870452 master-0 kubenswrapper[4813]: I1203 19:52:35.870392 4813 state_mem.go:35] "Initializing new in-memory state store"
Dec 03 19:52:35.909803 master-0 kubenswrapper[4813]: E1203 19:52:35.909242 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Dec 03 19:52:35.950593 master-0 kubenswrapper[4813]: I1203 19:52:35.950520 4813 manager.go:334] "Starting Device Plugin manager"
Dec 03 19:52:35.950593 master-0 kubenswrapper[4813]: I1203 19:52:35.950589 4813 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Dec 03 19:52:35.950771 master-0 kubenswrapper[4813]: I1203 19:52:35.950614 4813 server.go:79] "Starting device plugin registration server"
Dec 03 19:52:35.951528 master-0 kubenswrapper[4813]: I1203 19:52:35.951220 4813 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 03 19:52:35.951528 master-0 kubenswrapper[4813]: I1203 19:52:35.951253 4813 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 03 19:52:35.951528 master-0 kubenswrapper[4813]: I1203 19:52:35.951474 4813 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Dec 03 19:52:35.951865 master-0 kubenswrapper[4813]: I1203 19:52:35.951738 4813 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Dec 03 19:52:35.951865 master-0 kubenswrapper[4813]: I1203 19:52:35.951757 4813 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 03 19:52:35.953413 master-0 kubenswrapper[4813]: E1203 19:52:35.953242 4813 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Dec 03 19:52:36.012216 master-0 kubenswrapper[4813]: E1203 19:52:36.011039 4813 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Dec 03 19:52:36.018744 master-0 kubenswrapper[4813]: I1203 19:52:36.018634 4813 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 03 19:52:36.022122 master-0 kubenswrapper[4813]: I1203 19:52:36.022068 4813 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Dec 03 19:52:36.022243 master-0 kubenswrapper[4813]: I1203 19:52:36.022169 4813 status_manager.go:217] "Starting to sync pod status with apiserver"
Dec 03 19:52:36.022243 master-0 kubenswrapper[4813]: I1203 19:52:36.022212 4813 kubelet.go:2335] "Starting kubelet main sync loop"
Dec 03 19:52:36.022579 master-0 kubenswrapper[4813]: E1203 19:52:36.022480 4813 kubelet.go:2359] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Dec 03 19:52:36.023689 master-0 kubenswrapper[4813]: W1203 19:52:36.023570 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Dec 03 19:52:36.023847 master-0 kubenswrapper[4813]: E1203 19:52:36.023703 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Dec 03 19:52:36.052432 master-0 kubenswrapper[4813]: I1203 19:52:36.052330 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:36.053609 master-0 kubenswrapper[4813]: I1203 19:52:36.053555 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:36.053739 master-0 kubenswrapper[4813]: I1203 19:52:36.053624 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:36.053739 master-0 kubenswrapper[4813]: I1203 19:52:36.053650 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:36.053739 master-0 kubenswrapper[4813]: I1203 19:52:36.053707 4813 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Dec 03 19:52:36.055107 master-0 kubenswrapper[4813]: E1203 19:52:36.055036 4813 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Dec 03 19:52:36.123683 master-0 kubenswrapper[4813]: I1203 19:52:36.123537 4813 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0"]
Dec 03 19:52:36.123683 master-0 kubenswrapper[4813]: I1203 19:52:36.123668 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:36.125242 master-0 kubenswrapper[4813]: I1203 19:52:36.125188 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:36.125363 master-0 kubenswrapper[4813]: I1203 19:52:36.125282 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:36.125363 master-0 kubenswrapper[4813]: I1203 19:52:36.125300 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:36.125485 master-0 kubenswrapper[4813]: I1203 19:52:36.125432 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:36.126039 master-0 kubenswrapper[4813]: I1203 19:52:36.125972 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:52:36.126142 master-0 kubenswrapper[4813]: I1203 19:52:36.126110 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:36.126489 master-0 kubenswrapper[4813]: I1203 19:52:36.126421 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:36.126568 master-0 kubenswrapper[4813]: I1203 19:52:36.126511 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:36.126568 master-0 kubenswrapper[4813]: I1203 19:52:36.126538 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:36.126878 master-0 kubenswrapper[4813]: I1203 19:52:36.126832 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:36.126970 master-0 kubenswrapper[4813]: I1203 19:52:36.126947 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:52:36.127067 master-0 kubenswrapper[4813]: I1203 19:52:36.127037 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:36.127369 master-0 kubenswrapper[4813]: I1203 19:52:36.127322 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:36.127443 master-0 kubenswrapper[4813]: I1203 19:52:36.127374 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:36.127443 master-0 kubenswrapper[4813]: I1203 19:52:36.127392 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:36.128289 master-0 kubenswrapper[4813]: I1203 19:52:36.128244 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:36.128374 master-0 kubenswrapper[4813]: I1203 19:52:36.128292 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:36.128374 master-0 kubenswrapper[4813]: I1203 19:52:36.128308 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:36.128497 master-0 kubenswrapper[4813]: I1203 19:52:36.128426 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:36.128497 master-0 kubenswrapper[4813]: I1203 19:52:36.128404 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:36.128692 master-0 kubenswrapper[4813]: I1203 19:52:36.128511 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:36.128692 master-0
kubenswrapper[4813]: I1203 19:52:36.128537 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:52:36.128692 master-0 kubenswrapper[4813]: I1203 19:52:36.128593 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 19:52:36.128692 master-0 kubenswrapper[4813]: I1203 19:52:36.128638 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:52:36.129706 master-0 kubenswrapper[4813]: I1203 19:52:36.129641 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:52:36.129706 master-0 kubenswrapper[4813]: I1203 19:52:36.129701 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:52:36.129886 master-0 kubenswrapper[4813]: I1203 19:52:36.129725 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:52:36.129886 master-0 kubenswrapper[4813]: I1203 19:52:36.129737 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:52:36.129886 master-0 kubenswrapper[4813]: I1203 19:52:36.129772 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:52:36.129886 master-0 kubenswrapper[4813]: I1203 19:52:36.129816 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:52:36.130154 master-0 kubenswrapper[4813]: I1203 19:52:36.129952 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:52:36.130293 master-0 kubenswrapper[4813]: I1203 19:52:36.130237 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 19:52:36.130293 master-0 kubenswrapper[4813]: I1203 19:52:36.130284 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:52:36.131575 master-0 kubenswrapper[4813]: I1203 19:52:36.131357 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:52:36.131575 master-0 kubenswrapper[4813]: I1203 19:52:36.131405 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:52:36.131575 master-0 kubenswrapper[4813]: I1203 19:52:36.131425 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:52:36.131575 master-0 kubenswrapper[4813]: I1203 19:52:36.131366 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:52:36.131575 master-0 kubenswrapper[4813]: I1203 19:52:36.131539 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:52:36.131575 master-0 kubenswrapper[4813]: I1203 19:52:36.131560 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:52:36.132283 master-0 kubenswrapper[4813]: I1203 19:52:36.131692 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 19:52:36.132283 master-0 kubenswrapper[4813]: I1203 19:52:36.131736 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:52:36.132769 master-0 kubenswrapper[4813]: I1203 19:52:36.132739 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:52:36.132769 master-0 kubenswrapper[4813]: I1203 19:52:36.132770 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:52:36.132769 master-0 kubenswrapper[4813]: I1203 19:52:36.132796 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:52:36.208551 master-0 kubenswrapper[4813]: I1203 19:52:36.208469 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.208835 master-0 kubenswrapper[4813]: I1203 19:52:36.208670 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.208835 master-0 kubenswrapper[4813]: I1203 19:52:36.208766 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: 
\"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.209081 master-0 kubenswrapper[4813]: I1203 19:52:36.208877 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.209081 master-0 kubenswrapper[4813]: I1203 19:52:36.208974 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:52:36.209286 master-0 kubenswrapper[4813]: I1203 19:52:36.209160 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:52:36.209286 master-0 kubenswrapper[4813]: I1203 19:52:36.209211 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:52:36.209447 master-0 kubenswrapper[4813]: I1203 19:52:36.209305 4813 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:52:36.209513 master-0 kubenswrapper[4813]: I1203 19:52:36.209402 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.209513 master-0 kubenswrapper[4813]: I1203 19:52:36.209490 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.255338 master-0 kubenswrapper[4813]: I1203 19:52:36.255289 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:52:36.256759 master-0 kubenswrapper[4813]: I1203 19:52:36.256704 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:52:36.256845 master-0 kubenswrapper[4813]: I1203 19:52:36.256767 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:52:36.256845 master-0 kubenswrapper[4813]: I1203 19:52:36.256801 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:52:36.256980 master-0 kubenswrapper[4813]: I1203 
19:52:36.256902 4813 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 19:52:36.258150 master-0 kubenswrapper[4813]: E1203 19:52:36.258096 4813 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 03 19:52:36.311192 master-0 kubenswrapper[4813]: I1203 19:52:36.311011 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.311192 master-0 kubenswrapper[4813]: I1203 19:52:36.311138 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:52:36.311330 master-0 kubenswrapper[4813]: I1203 19:52:36.311198 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 19:52:36.311330 master-0 kubenswrapper[4813]: I1203 19:52:36.311243 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " 
pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.311330 master-0 kubenswrapper[4813]: I1203 19:52:36.311283 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 19:52:36.311330 master-0 kubenswrapper[4813]: I1203 19:52:36.311290 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.311330 master-0 kubenswrapper[4813]: I1203 19:52:36.311310 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:52:36.311941 master-0 kubenswrapper[4813]: I1203 19:52:36.311323 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-certs\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 03 19:52:36.311941 master-0 kubenswrapper[4813]: I1203 19:52:36.311481 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" 
(UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.311941 master-0 kubenswrapper[4813]: I1203 19:52:36.311582 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.311941 master-0 kubenswrapper[4813]: I1203 19:52:36.311644 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.311941 master-0 kubenswrapper[4813]: I1203 19:52:36.311676 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.311941 master-0 kubenswrapper[4813]: I1203 19:52:36.311705 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:52:36.311941 master-0 kubenswrapper[4813]: I1203 19:52:36.311737 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:52:36.311941 master-0 kubenswrapper[4813]: I1203 19:52:36.311742 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.311941 master-0 kubenswrapper[4813]: I1203 19:52:36.311770 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 19:52:36.311941 master-0 kubenswrapper[4813]: I1203 19:52:36.311755 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.311941 master-0 kubenswrapper[4813]: I1203 19:52:36.311823 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 19:52:36.311941 master-0 kubenswrapper[4813]: I1203 19:52:36.311816 4813 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.311941 master-0 kubenswrapper[4813]: I1203 19:52:36.311884 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 03 19:52:36.311941 master-0 kubenswrapper[4813]: I1203 19:52:36.311911 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:52:36.311941 master-0 kubenswrapper[4813]: I1203 19:52:36.311972 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.312575 master-0 kubenswrapper[4813]: I1203 19:52:36.312015 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:52:36.312575 master-0 kubenswrapper[4813]: I1203 19:52:36.312051 
4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:52:36.312575 master-0 kubenswrapper[4813]: I1203 19:52:36.312046 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.312575 master-0 kubenswrapper[4813]: I1203 19:52:36.312095 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:52:36.312575 master-0 kubenswrapper[4813]: I1203 19:52:36.312064 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:52:36.412415 master-0 kubenswrapper[4813]: I1203 19:52:36.412296 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 19:52:36.412415 master-0 
kubenswrapper[4813]: I1203 19:52:36.412351 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-certs\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 03 19:52:36.412415 master-0 kubenswrapper[4813]: I1203 19:52:36.412387 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:52:36.412415 master-0 kubenswrapper[4813]: I1203 19:52:36.412410 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 19:52:36.413185 master-0 kubenswrapper[4813]: I1203 19:52:36.412461 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-certs\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 03 19:52:36.413185 master-0 kubenswrapper[4813]: I1203 19:52:36.412470 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 19:52:36.413185 master-0 kubenswrapper[4813]: I1203 19:52:36.412423 4813 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 19:52:36.413185 master-0 kubenswrapper[4813]: E1203 19:52:36.412506 4813 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Dec 03 19:52:36.413185 master-0 kubenswrapper[4813]: I1203 19:52:36.412557 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:52:36.413185 master-0 kubenswrapper[4813]: I1203 19:52:36.412568 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 19:52:36.413185 master-0 kubenswrapper[4813]: I1203 19:52:36.412605 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 19:52:36.413185 master-0 kubenswrapper[4813]: I1203 19:52:36.412661 4813 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 03 19:52:36.413185 master-0 kubenswrapper[4813]: I1203 19:52:36.412706 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 19:52:36.413185 master-0 kubenswrapper[4813]: I1203 19:52:36.412755 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 19:52:36.413185 master-0 kubenswrapper[4813]: I1203 19:52:36.413039 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 03 19:52:36.468471 master-0 kubenswrapper[4813]: I1203 19:52:36.468402 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:52:36.484307 master-0 kubenswrapper[4813]: I1203 19:52:36.484245 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:52:36.517094 master-0 kubenswrapper[4813]: I1203 19:52:36.516248 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 19:52:36.551118 master-0 kubenswrapper[4813]: I1203 19:52:36.551017 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 19:52:36.556603 master-0 kubenswrapper[4813]: I1203 19:52:36.556554 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 19:52:36.659447 master-0 kubenswrapper[4813]: I1203 19:52:36.659212 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:52:36.660886 master-0 kubenswrapper[4813]: I1203 19:52:36.660838 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:52:36.660999 master-0 kubenswrapper[4813]: I1203 19:52:36.660892 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:52:36.660999 master-0 kubenswrapper[4813]: I1203 19:52:36.660911 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:52:36.663828 master-0 kubenswrapper[4813]: I1203 19:52:36.663511 4813 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 19:52:36.666085 master-0 kubenswrapper[4813]: E1203 19:52:36.665996 4813 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 03 19:52:36.804118 master-0 kubenswrapper[4813]: I1203 19:52:36.803979 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection 
refused Dec 03 19:52:36.837181 master-0 kubenswrapper[4813]: W1203 19:52:36.837039 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:36.837335 master-0 kubenswrapper[4813]: E1203 19:52:36.837185 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 19:52:37.045755 master-0 kubenswrapper[4813]: W1203 19:52:37.045529 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:37.045755 master-0 kubenswrapper[4813]: E1203 19:52:37.045639 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 19:52:37.128099 master-0 kubenswrapper[4813]: W1203 19:52:37.127988 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41b95a38663dd6fe34e183818a475977.slice/crio-03f773582fd952a02e3c74054c230118b5ae30a27243d494447b73fc93b2a301 WatchSource:0}: Error finding container 03f773582fd952a02e3c74054c230118b5ae30a27243d494447b73fc93b2a301: Status 404 returned error can't find the container 
with id 03f773582fd952a02e3c74054c230118b5ae30a27243d494447b73fc93b2a301 Dec 03 19:52:37.142588 master-0 kubenswrapper[4813]: I1203 19:52:37.142556 4813 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 19:52:37.160277 master-0 kubenswrapper[4813]: W1203 19:52:37.160213 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13238af3704fe583f617f61e755cf4c2.slice/crio-e223c914bb9fdab3679b22e12a3423e70834ea2d5e7b1b525318a3b2a1eb7382 WatchSource:0}: Error finding container e223c914bb9fdab3679b22e12a3423e70834ea2d5e7b1b525318a3b2a1eb7382: Status 404 returned error can't find the container with id e223c914bb9fdab3679b22e12a3423e70834ea2d5e7b1b525318a3b2a1eb7382 Dec 03 19:52:37.174811 master-0 kubenswrapper[4813]: W1203 19:52:37.174742 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd78739a7694769882b7e47ea5ac08a10.slice/crio-80067c895e8606a9acda897c0ee9b8e4c440d9838ee8d74d86c0a12d51b59462 WatchSource:0}: Error finding container 80067c895e8606a9acda897c0ee9b8e4c440d9838ee8d74d86c0a12d51b59462: Status 404 returned error can't find the container with id 80067c895e8606a9acda897c0ee9b8e4c440d9838ee8d74d86c0a12d51b59462 Dec 03 19:52:37.197588 master-0 kubenswrapper[4813]: W1203 19:52:37.197504 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bce50c457ac1f4721bc81a570dd238a.slice/crio-fb66039883a03fd1626aa3dffc21a20bb7b9e0bf48c135b576d0ba2ac23105d3 WatchSource:0}: Error finding container fb66039883a03fd1626aa3dffc21a20bb7b9e0bf48c135b576d0ba2ac23105d3: Status 404 returned error can't find the container with id fb66039883a03fd1626aa3dffc21a20bb7b9e0bf48c135b576d0ba2ac23105d3 Dec 03 19:52:37.214735 master-0 kubenswrapper[4813]: E1203 19:52:37.214650 4813 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Dec 03 19:52:37.251219 master-0 kubenswrapper[4813]: W1203 19:52:37.251082 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:37.251219 master-0 kubenswrapper[4813]: E1203 19:52:37.251208 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 19:52:37.379976 master-0 kubenswrapper[4813]: W1203 19:52:37.379808 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:37.379976 master-0 kubenswrapper[4813]: E1203 19:52:37.379900 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 19:52:37.467045 master-0 kubenswrapper[4813]: I1203 19:52:37.466952 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 
19:52:37.468500 master-0 kubenswrapper[4813]: I1203 19:52:37.468443 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:52:37.468500 master-0 kubenswrapper[4813]: I1203 19:52:37.468481 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:52:37.468500 master-0 kubenswrapper[4813]: I1203 19:52:37.468489 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:52:37.468731 master-0 kubenswrapper[4813]: I1203 19:52:37.468536 4813 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 19:52:37.469426 master-0 kubenswrapper[4813]: E1203 19:52:37.469344 4813 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 03 19:52:37.803384 master-0 kubenswrapper[4813]: I1203 19:52:37.803220 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:37.855051 master-0 kubenswrapper[4813]: I1203 19:52:37.854983 4813 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 03 19:52:37.857504 master-0 kubenswrapper[4813]: E1203 19:52:37.857421 4813 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 
19:52:38.028443 master-0 kubenswrapper[4813]: I1203 19:52:38.028321 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"fb66039883a03fd1626aa3dffc21a20bb7b9e0bf48c135b576d0ba2ac23105d3"} Dec 03 19:52:38.029660 master-0 kubenswrapper[4813]: I1203 19:52:38.029619 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"d78739a7694769882b7e47ea5ac08a10","Type":"ContainerStarted","Data":"80067c895e8606a9acda897c0ee9b8e4c440d9838ee8d74d86c0a12d51b59462"} Dec 03 19:52:38.031178 master-0 kubenswrapper[4813]: I1203 19:52:38.031146 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"13238af3704fe583f617f61e755cf4c2","Type":"ContainerStarted","Data":"e223c914bb9fdab3679b22e12a3423e70834ea2d5e7b1b525318a3b2a1eb7382"} Dec 03 19:52:38.032180 master-0 kubenswrapper[4813]: I1203 19:52:38.032141 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"41b95a38663dd6fe34e183818a475977","Type":"ContainerStarted","Data":"03f773582fd952a02e3c74054c230118b5ae30a27243d494447b73fc93b2a301"} Dec 03 19:52:38.033042 master-0 kubenswrapper[4813]: I1203 19:52:38.033005 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerStarted","Data":"46b628f030def8d568abe6c88697be71ce064596569bc0a66bddd83c9802cf26"} Dec 03 19:52:38.802938 master-0 kubenswrapper[4813]: I1203 19:52:38.802818 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection 
refused Dec 03 19:52:38.815957 master-0 kubenswrapper[4813]: E1203 19:52:38.815901 4813 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Dec 03 19:52:39.034896 master-0 kubenswrapper[4813]: W1203 19:52:39.034825 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:39.034896 master-0 kubenswrapper[4813]: E1203 19:52:39.034892 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 19:52:39.069577 master-0 kubenswrapper[4813]: I1203 19:52:39.069453 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:52:39.070392 master-0 kubenswrapper[4813]: W1203 19:52:39.070311 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:39.070392 master-0 kubenswrapper[4813]: E1203 19:52:39.070391 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 19:52:39.070922 master-0 kubenswrapper[4813]: I1203 19:52:39.070896 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:52:39.071010 master-0 kubenswrapper[4813]: I1203 19:52:39.070943 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:52:39.071010 master-0 kubenswrapper[4813]: I1203 19:52:39.070956 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:52:39.071010 master-0 kubenswrapper[4813]: I1203 19:52:39.071007 4813 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 19:52:39.071632 master-0 kubenswrapper[4813]: E1203 19:52:39.071590 4813 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 03 19:52:39.591415 master-0 kubenswrapper[4813]: W1203 19:52:39.591353 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:39.591415 master-0 kubenswrapper[4813]: E1203 19:52:39.591409 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 
19:52:39.802738 master-0 kubenswrapper[4813]: I1203 19:52:39.802682 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:40.086679 master-0 kubenswrapper[4813]: E1203 19:52:40.086515 4813 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.187dcc91da19634b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.800523595 +0000 UTC m=+0.229322084,LastTimestamp:2025-12-03 19:52:35.800523595 +0000 UTC m=+0.229322084,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:52:40.244306 master-0 kubenswrapper[4813]: W1203 19:52:40.244225 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:40.244402 master-0 kubenswrapper[4813]: E1203 19:52:40.244305 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 
19:52:40.804458 master-0 kubenswrapper[4813]: I1203 19:52:40.804185 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:41.041533 master-0 kubenswrapper[4813]: I1203 19:52:41.041354 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"41b95a38663dd6fe34e183818a475977","Type":"ContainerStarted","Data":"05747084f9e49c9f0d255ef42ef3e83cd2a8abb1990c562931e3ac0ccc06b877"} Dec 03 19:52:41.041533 master-0 kubenswrapper[4813]: I1203 19:52:41.041400 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:52:41.041533 master-0 kubenswrapper[4813]: I1203 19:52:41.041413 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"41b95a38663dd6fe34e183818a475977","Type":"ContainerStarted","Data":"fc327643e61db9d9337a443f21096010694e550ffc71b3be3921aca847fdd4bd"} Dec 03 19:52:41.042372 master-0 kubenswrapper[4813]: I1203 19:52:41.042318 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:52:41.042465 master-0 kubenswrapper[4813]: I1203 19:52:41.042378 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:52:41.042465 master-0 kubenswrapper[4813]: I1203 19:52:41.042390 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:52:41.043226 master-0 kubenswrapper[4813]: I1203 19:52:41.043175 4813 generic.go:334] "Generic (PLEG): container finished" podID="b495b0c38f2c54e7cc46282c5f92aab5" containerID="fc36d2a6c391f335aef0b36d050ebf1f8ee2adf514fce8229acd7a314425647c" exitCode=0 Dec 03 19:52:41.043226 
master-0 kubenswrapper[4813]: I1203 19:52:41.043216 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerDied","Data":"fc36d2a6c391f335aef0b36d050ebf1f8ee2adf514fce8229acd7a314425647c"} Dec 03 19:52:41.043427 master-0 kubenswrapper[4813]: I1203 19:52:41.043242 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:52:41.044115 master-0 kubenswrapper[4813]: I1203 19:52:41.044063 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:52:41.044115 master-0 kubenswrapper[4813]: I1203 19:52:41.044098 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:52:41.044115 master-0 kubenswrapper[4813]: I1203 19:52:41.044107 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:52:41.803131 master-0 kubenswrapper[4813]: I1203 19:52:41.803069 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:41.999646 master-0 kubenswrapper[4813]: I1203 19:52:41.999540 4813 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 03 19:52:42.001081 master-0 kubenswrapper[4813]: E1203 19:52:42.001016 4813 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: 
connection refused" logger="UnhandledError" Dec 03 19:52:42.017345 master-0 kubenswrapper[4813]: E1203 19:52:42.017275 4813 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Dec 03 19:52:42.047568 master-0 kubenswrapper[4813]: I1203 19:52:42.047483 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/0.log" Dec 03 19:52:42.048154 master-0 kubenswrapper[4813]: I1203 19:52:42.048105 4813 generic.go:334] "Generic (PLEG): container finished" podID="b495b0c38f2c54e7cc46282c5f92aab5" containerID="35a1127c6be591c3ca1a0386f5fc42fa23b00bc4bead7ebaf24887c23eddff59" exitCode=1 Dec 03 19:52:42.048261 master-0 kubenswrapper[4813]: I1203 19:52:42.048161 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerDied","Data":"35a1127c6be591c3ca1a0386f5fc42fa23b00bc4bead7ebaf24887c23eddff59"} Dec 03 19:52:42.048261 master-0 kubenswrapper[4813]: I1203 19:52:42.048222 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:52:42.048382 master-0 kubenswrapper[4813]: I1203 19:52:42.048193 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:52:42.054097 master-0 kubenswrapper[4813]: I1203 19:52:42.052489 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:52:42.054097 master-0 kubenswrapper[4813]: I1203 19:52:42.052541 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasNoDiskPressure" Dec 03 19:52:42.054097 master-0 kubenswrapper[4813]: I1203 19:52:42.052558 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:52:42.054097 master-0 kubenswrapper[4813]: I1203 19:52:42.052979 4813 scope.go:117] "RemoveContainer" containerID="35a1127c6be591c3ca1a0386f5fc42fa23b00bc4bead7ebaf24887c23eddff59" Dec 03 19:52:42.054097 master-0 kubenswrapper[4813]: I1203 19:52:42.053206 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:52:42.054097 master-0 kubenswrapper[4813]: I1203 19:52:42.053261 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:52:42.054097 master-0 kubenswrapper[4813]: I1203 19:52:42.053285 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:52:42.272648 master-0 kubenswrapper[4813]: I1203 19:52:42.272539 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:52:42.273849 master-0 kubenswrapper[4813]: I1203 19:52:42.273822 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:52:42.273849 master-0 kubenswrapper[4813]: I1203 19:52:42.273852 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:52:42.274044 master-0 kubenswrapper[4813]: I1203 19:52:42.273863 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:52:42.274044 master-0 kubenswrapper[4813]: I1203 19:52:42.273908 4813 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 19:52:42.274684 master-0 kubenswrapper[4813]: E1203 19:52:42.274620 4813 kubelet_node_status.go:99] "Unable to 
register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 03 19:52:42.803990 master-0 kubenswrapper[4813]: I1203 19:52:42.803937 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:42.808688 master-0 kubenswrapper[4813]: W1203 19:52:42.808640 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:42.808747 master-0 kubenswrapper[4813]: E1203 19:52:42.808708 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 19:52:43.259090 master-0 kubenswrapper[4813]: W1203 19:52:43.258846 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:43.259090 master-0 kubenswrapper[4813]: E1203 19:52:43.258956 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 
192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 19:52:43.803556 master-0 kubenswrapper[4813]: I1203 19:52:43.803494 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:44.626338 master-0 kubenswrapper[4813]: W1203 19:52:44.626198 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:44.626338 master-0 kubenswrapper[4813]: E1203 19:52:44.626318 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 19:52:44.805605 master-0 kubenswrapper[4813]: I1203 19:52:44.805497 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:44.937426 master-0 kubenswrapper[4813]: W1203 19:52:44.937232 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:44.937426 master-0 kubenswrapper[4813]: E1203 19:52:44.937339 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 19:52:45.806113 master-0 kubenswrapper[4813]: I1203 19:52:45.805993 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:45.953938 master-0 kubenswrapper[4813]: E1203 19:52:45.953876 4813 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 03 19:52:46.060153 master-0 kubenswrapper[4813]: I1203 19:52:46.059940 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/0.log" Dec 03 19:52:46.060537 master-0 kubenswrapper[4813]: I1203 19:52:46.060471 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerStarted","Data":"448bf0aad34244395721a7bea2c8730e9a87d7dd5059f2a2e11feb2e9df02380"} Dec 03 19:52:46.804362 master-0 kubenswrapper[4813]: I1203 19:52:46.804284 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 19:52:47.066191 master-0 kubenswrapper[4813]: I1203 19:52:47.066005 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/1.log" Dec 03 
19:52:47.067014 master-0 kubenswrapper[4813]: I1203 19:52:47.066674 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/0.log"
Dec 03 19:52:47.067373 master-0 kubenswrapper[4813]: I1203 19:52:47.067301 4813 generic.go:334] "Generic (PLEG): container finished" podID="b495b0c38f2c54e7cc46282c5f92aab5" containerID="448bf0aad34244395721a7bea2c8730e9a87d7dd5059f2a2e11feb2e9df02380" exitCode=1
Dec 03 19:52:47.067453 master-0 kubenswrapper[4813]: I1203 19:52:47.067367 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerDied","Data":"448bf0aad34244395721a7bea2c8730e9a87d7dd5059f2a2e11feb2e9df02380"}
Dec 03 19:52:47.067453 master-0 kubenswrapper[4813]: I1203 19:52:47.067439 4813 scope.go:117] "RemoveContainer" containerID="35a1127c6be591c3ca1a0386f5fc42fa23b00bc4bead7ebaf24887c23eddff59"
Dec 03 19:52:47.067576 master-0 kubenswrapper[4813]: I1203 19:52:47.067472 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:47.069004 master-0 kubenswrapper[4813]: I1203 19:52:47.068950 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:47.069136 master-0 kubenswrapper[4813]: I1203 19:52:47.069009 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:47.069136 master-0 kubenswrapper[4813]: I1203 19:52:47.069048 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:47.070194 master-0 kubenswrapper[4813]: I1203 19:52:47.069609 4813 scope.go:117] "RemoveContainer" containerID="448bf0aad34244395721a7bea2c8730e9a87d7dd5059f2a2e11feb2e9df02380"
Dec 03 19:52:47.070194 master-0 kubenswrapper[4813]: E1203 19:52:47.069924 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="b495b0c38f2c54e7cc46282c5f92aab5"
Dec 03 19:52:47.803767 master-0 kubenswrapper[4813]: I1203 19:52:47.803565 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Dec 03 19:52:48.072665 master-0 kubenswrapper[4813]: I1203 19:52:48.072456 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/1.log"
Dec 03 19:52:48.073883 master-0 kubenswrapper[4813]: I1203 19:52:48.073837 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:48.074992 master-0 kubenswrapper[4813]: I1203 19:52:48.074907 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:48.074992 master-0 kubenswrapper[4813]: I1203 19:52:48.074967 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:48.074992 master-0 kubenswrapper[4813]: I1203 19:52:48.074985 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:48.075469 master-0 kubenswrapper[4813]: I1203 19:52:48.075436 4813 scope.go:117] "RemoveContainer" containerID="448bf0aad34244395721a7bea2c8730e9a87d7dd5059f2a2e11feb2e9df02380"
Dec 03 19:52:48.075718 master-0 kubenswrapper[4813]: E1203 19:52:48.075658 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="b495b0c38f2c54e7cc46282c5f92aab5"
Dec 03 19:52:48.419913 master-0 kubenswrapper[4813]: E1203 19:52:48.419765 4813 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s"
Dec 03 19:52:48.675880 master-0 kubenswrapper[4813]: I1203 19:52:48.675625 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:48.677310 master-0 kubenswrapper[4813]: I1203 19:52:48.677198 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:48.677310 master-0 kubenswrapper[4813]: I1203 19:52:48.677260 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:48.677310 master-0 kubenswrapper[4813]: I1203 19:52:48.677284 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:48.677668 master-0 kubenswrapper[4813]: I1203 19:52:48.677370 4813 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Dec 03 19:52:48.678376 master-0 kubenswrapper[4813]: E1203 19:52:48.678314 4813 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Dec 03 19:52:48.803858 master-0 kubenswrapper[4813]: I1203 19:52:48.803704 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Dec 03 19:52:49.803634 master-0 kubenswrapper[4813]: I1203 19:52:49.803581 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Dec 03 19:52:50.088259 master-0 kubenswrapper[4813]: E1203 19:52:50.087956 4813 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.187dcc91da19634b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.800523595 +0000 UTC m=+0.229322084,LastTimestamp:2025-12-03 19:52:35.800523595 +0000 UTC m=+0.229322084,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Dec 03 19:52:50.109985 master-0 kubenswrapper[4813]: W1203 19:52:50.109891 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Dec 03 19:52:50.109985 master-0 kubenswrapper[4813]: E1203 19:52:50.109968 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Dec 03 19:52:50.542451 master-0 kubenswrapper[4813]: I1203 19:52:50.542339 4813 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 03 19:52:50.543819 master-0 kubenswrapper[4813]: E1203 19:52:50.543760 4813 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Dec 03 19:52:50.804732 master-0 kubenswrapper[4813]: I1203 19:52:50.804462 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Dec 03 19:52:51.803887 master-0 kubenswrapper[4813]: I1203 19:52:51.803832 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Dec 03 19:52:52.084457 master-0 kubenswrapper[4813]: I1203 19:52:52.084406 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"7def324ef495c1e55c8e9233ccd93d3408c35454ff9a9bc3bac5d21a48173630"}
Dec 03 19:52:52.085422 master-0 kubenswrapper[4813]: I1203 19:52:52.085392 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"d78739a7694769882b7e47ea5ac08a10","Type":"ContainerStarted","Data":"ca335c8e4de4141862b380dce4757695adee236b409b9c589070127007153500"}
Dec 03 19:52:52.085531 master-0 kubenswrapper[4813]: I1203 19:52:52.085502 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:52.086218 master-0 kubenswrapper[4813]: I1203 19:52:52.086191 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:52.086218 master-0 kubenswrapper[4813]: I1203 19:52:52.086218 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:52.086324 master-0 kubenswrapper[4813]: I1203 19:52:52.086229 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:52.086833 master-0 kubenswrapper[4813]: I1203 19:52:52.086799 4813 generic.go:334] "Generic (PLEG): container finished" podID="13238af3704fe583f617f61e755cf4c2" containerID="0dd950185e59dc19fc3c4c25df60c0ffa205c3f9c227153b287f2a2e9b2b9bb6" exitCode=0
Dec 03 19:52:52.086897 master-0 kubenswrapper[4813]: I1203 19:52:52.086837 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"13238af3704fe583f617f61e755cf4c2","Type":"ContainerDied","Data":"0dd950185e59dc19fc3c4c25df60c0ffa205c3f9c227153b287f2a2e9b2b9bb6"}
Dec 03 19:52:52.086897 master-0 kubenswrapper[4813]: I1203 19:52:52.086846 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:52.087584 master-0 kubenswrapper[4813]: I1203 19:52:52.087546 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:52.087622 master-0 kubenswrapper[4813]: I1203 19:52:52.087603 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:52.087648 master-0 kubenswrapper[4813]: I1203 19:52:52.087621 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:52.090358 master-0 kubenswrapper[4813]: I1203 19:52:52.090335 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:52.091012 master-0 kubenswrapper[4813]: I1203 19:52:52.090974 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:52.091062 master-0 kubenswrapper[4813]: I1203 19:52:52.091030 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:52.091062 master-0 kubenswrapper[4813]: I1203 19:52:52.091051 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:53.091465 master-0 kubenswrapper[4813]: I1203 19:52:53.091033 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"13238af3704fe583f617f61e755cf4c2","Type":"ContainerStarted","Data":"f5e393ad9e4b2248a04a2f1824c04ab00a09cf0b58d03e8aab531d5a360dcee3"}
Dec 03 19:52:53.091465 master-0 kubenswrapper[4813]: I1203 19:52:53.091056 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:53.092400 master-0 kubenswrapper[4813]: I1203 19:52:53.092369 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:53.092448 master-0 kubenswrapper[4813]: I1203 19:52:53.092404 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:53.092448 master-0 kubenswrapper[4813]: I1203 19:52:53.092414 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:53.607453 master-0 kubenswrapper[4813]: I1203 19:52:53.607410 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Dec 03 19:52:53.808745 master-0 kubenswrapper[4813]: I1203 19:52:53.808706 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Dec 03 19:52:54.013912 master-0 kubenswrapper[4813]: W1203 19:52:54.013852 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Dec 03 19:52:54.014153 master-0 kubenswrapper[4813]: E1203 19:52:54.013948 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Dec 03 19:52:54.808021 master-0 kubenswrapper[4813]: I1203 19:52:54.807989 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Dec 03 19:52:55.097694 master-0 kubenswrapper[4813]: I1203 19:52:55.097537 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"13238af3704fe583f617f61e755cf4c2","Type":"ContainerStarted","Data":"c991f7eb8b7ab6bf0ec4d63b9e190d2c916d375ff6a09d91a53387bce5766b67"}
Dec 03 19:52:55.097920 master-0 kubenswrapper[4813]: I1203 19:52:55.097740 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:55.098834 master-0 kubenswrapper[4813]: I1203 19:52:55.098747 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:55.098834 master-0 kubenswrapper[4813]: I1203 19:52:55.098833 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:55.099017 master-0 kubenswrapper[4813]: I1203 19:52:55.098857 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:55.101927 master-0 kubenswrapper[4813]: I1203 19:52:55.101874 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"23c2b742ed78624af8a87bafdac0a226661dbc177a2ddfac515be738b044bdfc"}
Dec 03 19:52:55.102127 master-0 kubenswrapper[4813]: I1203 19:52:55.102012 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:55.103005 master-0 kubenswrapper[4813]: I1203 19:52:55.102955 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:55.103099 master-0 kubenswrapper[4813]: I1203 19:52:55.103012 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:55.103099 master-0 kubenswrapper[4813]: I1203 19:52:55.103034 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:55.427879 master-0 kubenswrapper[4813]: E1203 19:52:55.427770 4813 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Dec 03 19:52:55.679111 master-0 kubenswrapper[4813]: I1203 19:52:55.678876 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:55.680667 master-0 kubenswrapper[4813]: I1203 19:52:55.680610 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:55.680839 master-0 kubenswrapper[4813]: I1203 19:52:55.680674 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:55.680839 master-0 kubenswrapper[4813]: I1203 19:52:55.680692 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:55.680839 master-0 kubenswrapper[4813]: I1203 19:52:55.680754 4813 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Dec 03 19:52:55.688275 master-0 kubenswrapper[4813]: E1203 19:52:55.688197 4813 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Dec 03 19:52:55.810663 master-0 kubenswrapper[4813]: I1203 19:52:55.810602 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Dec 03 19:52:55.955115 master-0 kubenswrapper[4813]: E1203 19:52:55.954977 4813 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Dec 03 19:52:56.103809 master-0 kubenswrapper[4813]: I1203 19:52:56.103756 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:56.103997 master-0 kubenswrapper[4813]: I1203 19:52:56.103756 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:56.104523 master-0 kubenswrapper[4813]: I1203 19:52:56.104486 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:56.104523 master-0 kubenswrapper[4813]: I1203 19:52:56.104514 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:56.104523 master-0 kubenswrapper[4813]: I1203 19:52:56.104525 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:56.104978 master-0 kubenswrapper[4813]: I1203 19:52:56.104929 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:56.104978 master-0 kubenswrapper[4813]: I1203 19:52:56.104967 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:56.104978 master-0 kubenswrapper[4813]: I1203 19:52:56.104982 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:56.329756 master-0 kubenswrapper[4813]: I1203 19:52:56.329555 4813 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:52:56.335727 master-0 kubenswrapper[4813]: I1203 19:52:56.335675 4813 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:52:56.721819 master-0 kubenswrapper[4813]: W1203 19:52:56.721738 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Dec 03 19:52:56.722042 master-0 kubenswrapper[4813]: E1203 19:52:56.721856 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Dec 03 19:52:56.810227 master-0 kubenswrapper[4813]: I1203 19:52:56.810108 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Dec 03 19:52:57.106946 master-0 kubenswrapper[4813]: I1203 19:52:57.106755 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:57.106946 master-0 kubenswrapper[4813]: I1203 19:52:57.106872 4813 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:52:57.108266 master-0 kubenswrapper[4813]: I1203 19:52:57.108199 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:57.108399 master-0 kubenswrapper[4813]: I1203 19:52:57.108268 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:57.108399 master-0 kubenswrapper[4813]: I1203 19:52:57.108312 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:57.388443 master-0 kubenswrapper[4813]: W1203 19:52:57.388375 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Dec 03 19:52:57.388590 master-0 kubenswrapper[4813]: E1203 19:52:57.388459 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Dec 03 19:52:57.808596 master-0 kubenswrapper[4813]: I1203 19:52:57.808476 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Dec 03 19:52:58.109116 master-0 kubenswrapper[4813]: I1203 19:52:58.109047 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:58.110161 master-0 kubenswrapper[4813]: I1203 19:52:58.110079 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:58.110161 master-0 kubenswrapper[4813]: I1203 19:52:58.110118 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:58.110161 master-0 kubenswrapper[4813]: I1203 19:52:58.110134 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:58.809024 master-0 kubenswrapper[4813]: I1203 19:52:58.808964 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Dec 03 19:52:58.869639 master-0 kubenswrapper[4813]: I1203 19:52:58.869509 4813 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:52:58.870127 master-0 kubenswrapper[4813]: I1203 19:52:58.869722 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:58.871161 master-0 kubenswrapper[4813]: I1203 19:52:58.871082 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:58.871161 master-0 kubenswrapper[4813]: I1203 19:52:58.871106 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:58.871161 master-0 kubenswrapper[4813]: I1203 19:52:58.871116 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:58.876101 master-0 kubenswrapper[4813]: I1203 19:52:58.876037 4813 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:52:59.110589 master-0 kubenswrapper[4813]: I1203 19:52:59.110518 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:59.111235 master-0 kubenswrapper[4813]: I1203 19:52:59.110725 4813 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:52:59.111656 master-0 kubenswrapper[4813]: I1203 19:52:59.111608 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:59.111720 master-0 kubenswrapper[4813]: I1203 19:52:59.111664 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:59.111720 master-0 kubenswrapper[4813]: I1203 19:52:59.111687 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:52:59.116100 master-0 kubenswrapper[4813]: I1203 19:52:59.116046 4813 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:52:59.810614 master-0 kubenswrapper[4813]: I1203 19:52:59.810494 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Dec 03 19:52:59.846203 master-0 kubenswrapper[4813]: I1203 19:52:59.846102 4813 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:52:59.846463 master-0 kubenswrapper[4813]: I1203 19:52:59.846263 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:52:59.847549 master-0 kubenswrapper[4813]: I1203 19:52:59.847480 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:52:59.847631 master-0 kubenswrapper[4813]: I1203 19:52:59.847560 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:52:59.847631 master-0 kubenswrapper[4813]: I1203 19:52:59.847586 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:53:00.096220 master-0 kubenswrapper[4813]: E1203 19:53:00.095889 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91da19634b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.800523595 +0000 UTC m=+0.229322084,LastTimestamp:2025-12-03 19:52:35.800523595 +0000 UTC m=+0.229322084,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Dec 03 19:53:00.102536 master-0 kubenswrapper[4813]: E1203 19:53:00.102370 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde4970e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864172302 +0000 UTC m=+0.292970791,LastTimestamp:2025-12-03 19:52:35.864172302 +0000 UTC m=+0.292970791,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Dec 03 19:53:00.109177 master-0 kubenswrapper[4813]: E1203 19:53:00.109016 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde52ae8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864210152 +0000 UTC m=+0.293008641,LastTimestamp:2025-12-03 19:52:35.864210152 +0000 UTC m=+0.293008641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Dec 03 19:53:00.113158 master-0 kubenswrapper[4813]: I1203 19:53:00.113100 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 19:53:00.113903 master-0 kubenswrapper[4813]: I1203 19:53:00.113842 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 03 19:53:00.113903 master-0 kubenswrapper[4813]: I1203 19:53:00.113887 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 03 19:53:00.113903 master-0 kubenswrapper[4813]: I1203 19:53:00.113899 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 03 19:53:00.115994 master-0 kubenswrapper[4813]: E1203 19:53:00.115821 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde57bd8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864230872 +0000 UTC m=+0.293029361,LastTimestamp:2025-12-03 19:52:35.864230872 +0000 UTC m=+0.293029361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Dec 03 19:53:00.120972 master-0 kubenswrapper[4813]: E1203 19:53:00.120862 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91e35d7d38 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.955981624 +0000 UTC m=+0.384780103,LastTimestamp:2025-12-03 19:52:35.955981624 +0000 UTC m=+0.384780103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Dec 03 19:53:00.125978 master-0 kubenswrapper[4813]: E1203 19:53:00.125800 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde4970e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde4970e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864172302 +0000 UTC m=+0.292970791,LastTimestamp:2025-12-03 19:52:36.053598527 +0000 UTC m=+0.482397006,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Dec 03 19:53:00.132377 master-0 kubenswrapper[4813]: E1203 19:53:00.132229 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde52ae8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde52ae8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864210152 +0000 UTC m=+0.293008641,LastTimestamp:2025-12-03 19:52:36.053640167 +0000 UTC m=+0.482438656,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Dec 03 19:53:00.136849 master-0 kubenswrapper[4813]: E1203 19:53:00.136675 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde57bd8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde57bd8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864230872 +0000 UTC m=+0.293029361,LastTimestamp:2025-12-03 19:52:36.053663358 +0000 UTC m=+0.482461857,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Dec 03 19:53:00.141007 master-0 kubenswrapper[4813]: E1203 19:53:00.140889 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde4970e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde4970e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864172302 +0000 UTC m=+0.292970791,LastTimestamp:2025-12-03 19:52:36.125220396 +0000 UTC m=+0.554018885,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Dec 03 19:53:00.147661 master-0 kubenswrapper[4813]: E1203 19:53:00.147545 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde52ae8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde52ae8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864210152 +0000 UTC m=+0.293008641,LastTimestamp:2025-12-03 19:52:36.125293307 +0000 UTC m=+0.554091786,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Dec 03 19:53:00.154226 master-0 kubenswrapper[4813]: E1203 19:53:00.153989 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde57bd8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde57bd8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864230872 +0000 UTC m=+0.293029361,LastTimestamp:2025-12-03 19:52:36.125309758 +0000 UTC m=+0.554108237,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Dec 03 19:53:00.161180 master-0 kubenswrapper[4813]: E1203 19:53:00.161033 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde4970e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde4970e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864172302 +0000 UTC m=+0.292970791,LastTimestamp:2025-12-03 19:52:36.126477906 +0000 UTC m=+0.555276395,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.167759 master-0 kubenswrapper[4813]: E1203 19:53:00.167617 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde52ae8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde52ae8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864210152 +0000 UTC m=+0.293008641,LastTimestamp:2025-12-03 19:52:36.126527867 +0000 UTC m=+0.555326366,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.174522 master-0 kubenswrapper[4813]: E1203 19:53:00.174347 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde57bd8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde57bd8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864230872 +0000 UTC m=+0.293029361,LastTimestamp:2025-12-03 19:52:36.126551297 +0000 UTC m=+0.555349796,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.181238 master-0 kubenswrapper[4813]: E1203 19:53:00.181165 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde4970e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde4970e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864172302 +0000 UTC m=+0.292970791,LastTimestamp:2025-12-03 19:52:36.12735151 +0000 UTC m=+0.556149989,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.186025 master-0 kubenswrapper[4813]: E1203 19:53:00.185950 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde52ae8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde52ae8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864210152 +0000 UTC m=+0.293008641,LastTimestamp:2025-12-03 19:52:36.12738619 +0000 UTC m=+0.556184679,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.190479 master-0 kubenswrapper[4813]: E1203 19:53:00.190384 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde57bd8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde57bd8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864230872 +0000 UTC m=+0.293029361,LastTimestamp:2025-12-03 19:52:36.12740168 +0000 UTC m=+0.556200159,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.195066 master-0 kubenswrapper[4813]: E1203 19:53:00.194932 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde4970e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde4970e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864172302 +0000 UTC m=+0.292970791,LastTimestamp:2025-12-03 19:52:36.128272403 +0000 UTC m=+0.557070882,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.203044 master-0 kubenswrapper[4813]: E1203 19:53:00.202907 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde52ae8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde52ae8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864210152 +0000 UTC m=+0.293008641,LastTimestamp:2025-12-03 19:52:36.128302464 +0000 UTC m=+0.557100943,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.208503 master-0 kubenswrapper[4813]: E1203 19:53:00.208397 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde57bd8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde57bd8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864230872 +0000 UTC m=+0.293029361,LastTimestamp:2025-12-03 19:52:36.128318314 +0000 UTC m=+0.557116803,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.213770 master-0 kubenswrapper[4813]: E1203 19:53:00.213624 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde4970e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde4970e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864172302 +0000 UTC m=+0.292970791,LastTimestamp:2025-12-03 19:52:36.128494808 +0000 UTC m=+0.557293297,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.222959 master-0 kubenswrapper[4813]: E1203 19:53:00.222819 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde52ae8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde52ae8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864210152 +0000 UTC m=+0.293008641,LastTimestamp:2025-12-03 19:52:36.128527148 +0000 UTC m=+0.557325647,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.228725 master-0 kubenswrapper[4813]: E1203 19:53:00.228607 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde57bd8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde57bd8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864230872 +0000 UTC m=+0.293029361,LastTimestamp:2025-12-03 19:52:36.128550338 +0000 UTC m=+0.557348827,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.234502 master-0 kubenswrapper[4813]: E1203 19:53:00.234327 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde4970e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde4970e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864172302 +0000 UTC m=+0.292970791,LastTimestamp:2025-12-03 19:52:36.129684735 +0000 UTC m=+0.558483234,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.239683 master-0 kubenswrapper[4813]: E1203 19:53:00.239534 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dcc91dde52ae8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dcc91dde52ae8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:35.864210152 +0000 UTC m=+0.293008641,LastTimestamp:2025-12-03 19:52:36.129715506 +0000 UTC m=+0.558514005,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.252097 master-0 kubenswrapper[4813]: E1203 19:53:00.251957 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187dcc922a15f7e2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:41b95a38663dd6fe34e183818a475977,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c921698d30c8175da0c124f72748e93551d6903b0f34d26743b60cb12d25cb1\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:37.14247677 +0000 UTC m=+1.571275259,LastTimestamp:2025-12-03 19:52:37.14247677 +0000 UTC m=+1.571275259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.259065 master-0 kubenswrapper[4813]: E1203 19:53:00.258910 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dcc922b6c4b86 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:37.164911494 +0000 UTC m=+1.593709963,LastTimestamp:2025-12-03 19:52:37.164911494 +0000 UTC m=+1.593709963,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.265555 master-0 kubenswrapper[4813]: E1203 19:53:00.265397 4813 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187dcc922c8ad65d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:d78739a7694769882b7e47ea5ac08a10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:37.183690333 +0000 UTC m=+1.612488782,LastTimestamp:2025-12-03 19:52:37.183690333 +0000 UTC m=+1.612488782,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.272434 master-0 kubenswrapper[4813]: E1203 19:53:00.272286 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dcc922dc64581 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:37.204362625 +0000 UTC m=+1.633161114,LastTimestamp:2025-12-03 19:52:37.204362625 +0000 UTC 
m=+1.633161114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.279052 master-0 kubenswrapper[4813]: E1203 19:53:00.278925 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dcc922ebbd330 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b03d2897e7cc0e8d0c306acb68ca3d9396d502882c14942faadfdb16bc40e17d\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:37.220455216 +0000 UTC m=+1.649253705,LastTimestamp:2025-12-03 19:52:37.220455216 +0000 UTC m=+1.649253705,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.285322 master-0 kubenswrapper[4813]: E1203 19:53:00.285195 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dcc92e5d9af48 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b03d2897e7cc0e8d0c306acb68ca3d9396d502882c14942faadfdb16bc40e17d\" in 3.072s (3.072s including waiting). Image size: 459566623 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:40.292642632 +0000 UTC m=+4.721441081,LastTimestamp:2025-12-03 19:52:40.292642632 +0000 UTC m=+4.721441081,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.291981 master-0 kubenswrapper[4813]: E1203 19:53:00.291867 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187dcc92e705a783 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:41b95a38663dd6fe34e183818a475977,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c921698d30c8175da0c124f72748e93551d6903b0f34d26743b60cb12d25cb1\" in 3.169s (3.169s including waiting). 
Image size: 532668041 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:40.312301443 +0000 UTC m=+4.741099902,LastTimestamp:2025-12-03 19:52:40.312301443 +0000 UTC m=+4.741099902,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.296925 master-0 kubenswrapper[4813]: E1203 19:53:00.296729 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187dcc92f1efaca5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:41b95a38663dd6fe34e183818a475977,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:40.495410341 +0000 UTC m=+4.924208790,LastTimestamp:2025-12-03 19:52:40.495410341 +0000 UTC m=+4.924208790,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.302901 master-0 kubenswrapper[4813]: E1203 19:53:00.302701 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dcc92f1f16caa openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:40.495525034 +0000 UTC m=+4.924323483,LastTimestamp:2025-12-03 19:52:40.495525034 +0000 UTC m=+4.924323483,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.307329 master-0 kubenswrapper[4813]: E1203 19:53:00.307213 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187dcc92f2a17ffc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:41b95a38663dd6fe34e183818a475977,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:40.507064316 +0000 UTC m=+4.935862765,LastTimestamp:2025-12-03 19:52:40.507064316 +0000 UTC m=+4.935862765,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.312443 master-0 kubenswrapper[4813]: E1203 19:53:00.312302 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dcc92f2c479b3 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:40.509356467 +0000 UTC m=+4.938154916,LastTimestamp:2025-12-03 19:52:40.509356467 +0000 UTC m=+4.938154916,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.317585 master-0 kubenswrapper[4813]: E1203 19:53:00.317436 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187dcc92f2c65be5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:41b95a38663dd6fe34e183818a475977,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c921698d30c8175da0c124f72748e93551d6903b0f34d26743b60cb12d25cb1\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:40.509479909 +0000 UTC m=+4.938278368,LastTimestamp:2025-12-03 19:52:40.509479909 +0000 UTC m=+4.938278368,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.322435 master-0 kubenswrapper[4813]: E1203 19:53:00.322306 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187dcc92fd02d8a6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:41b95a38663dd6fe34e183818a475977,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:40.681216166 +0000 UTC m=+5.110014615,LastTimestamp:2025-12-03 19:52:40.681216166 +0000 UTC m=+5.110014615,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.328579 master-0 kubenswrapper[4813]: E1203 19:53:00.328473 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187dcc92fd82641c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:41b95a38663dd6fe34e183818a475977,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:40.68957494 +0000 UTC m=+5.118373389,LastTimestamp:2025-12-03 19:52:40.68957494 +0000 UTC m=+5.118373389,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.334789 master-0 kubenswrapper[4813]: E1203 19:53:00.334710 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dcc9312c869b2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b03d2897e7cc0e8d0c306acb68ca3d9396d502882c14942faadfdb16bc40e17d\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:41.046485426 +0000 UTC m=+5.475283885,LastTimestamp:2025-12-03 19:52:41.046485426 +0000 UTC m=+5.475283885,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.340329 master-0 kubenswrapper[4813]: E1203 19:53:00.340249 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dcc931e21dcab openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:41.236896939 +0000 UTC m=+5.665695388,LastTimestamp:2025-12-03 19:52:41.236896939 +0000 UTC 
m=+5.665695388,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.344887 master-0 kubenswrapper[4813]: E1203 19:53:00.344791 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dcc931ed1a3e7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:41.248416743 +0000 UTC m=+5.677215192,LastTimestamp:2025-12-03 19:52:41.248416743 +0000 UTC m=+5.677215192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.349447 master-0 kubenswrapper[4813]: E1203 19:53:00.349333 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187dcc9312c869b2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dcc9312c869b2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b03d2897e7cc0e8d0c306acb68ca3d9396d502882c14942faadfdb16bc40e17d\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:41.046485426 +0000 UTC m=+5.475283885,LastTimestamp:2025-12-03 19:52:44.950266654 +0000 UTC m=+9.379065103,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.355664 master-0 kubenswrapper[4813]: E1203 19:53:00.355500 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187dcc931e21dcab\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dcc931e21dcab openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:41.236896939 +0000 UTC m=+5.665695388,LastTimestamp:2025-12-03 19:52:45.946942928 +0000 UTC m=+10.375741417,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.359436 master-0 kubenswrapper[4813]: E1203 19:53:00.359277 4813 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187dcc931ed1a3e7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dcc931ed1a3e7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:41.248416743 +0000 UTC m=+5.677215192,LastTimestamp:2025-12-03 19:52:46.836852344 +0000 UTC m=+11.265650823,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.365735 master-0 kubenswrapper[4813]: E1203 19:53:00.365611 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dcc9479cdee93 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod 
kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:47.069867667 +0000 UTC m=+11.498666156,LastTimestamp:2025-12-03 19:52:47.069867667 +0000 UTC m=+11.498666156,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.369496 master-0 kubenswrapper[4813]: E1203 19:53:00.369423 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187dcc9479cdee93\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dcc9479cdee93 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:47.069867667 +0000 UTC m=+11.498666156,LastTimestamp:2025-12-03 19:52:48.075614148 +0000 UTC m=+12.504412627,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.373976 master-0 kubenswrapper[4813]: E1203 19:53:00.373854 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dcc95894ecdd5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\" in 14.459s (14.459s including waiting). Image size: 938321573 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:51.624938965 +0000 UTC m=+16.053737454,LastTimestamp:2025-12-03 19:52:51.624938965 +0000 UTC m=+16.053737454,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.378491 master-0 kubenswrapper[4813]: E1203 19:53:00.378372 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dcc958a7f31ba kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\" in 14.44s (14.44s including waiting). 
Image size: 938321573 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:51.644887482 +0000 UTC m=+16.073685941,LastTimestamp:2025-12-03 19:52:51.644887482 +0000 UTC m=+16.073685941,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.383618 master-0 kubenswrapper[4813]: E1203 19:53:00.383443 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187dcc958c3649ea kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:d78739a7694769882b7e47ea5ac08a10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\" in 14.489s (14.489s including waiting). 
Image size: 938321573 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:51.673663978 +0000 UTC m=+16.102462447,LastTimestamp:2025-12-03 19:52:51.673663978 +0000 UTC m=+16.102462447,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.388425 master-0 kubenswrapper[4813]: E1203 19:53:00.388334 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dcc9596ce2a3d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:51.851389501 +0000 UTC m=+16.280187950,LastTimestamp:2025-12-03 19:52:51.851389501 +0000 UTC m=+16.280187950,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.391969 master-0 kubenswrapper[4813]: E1203 19:53:00.391827 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dcc9597964eda openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:51.864506074 +0000 UTC m=+16.293304523,LastTimestamp:2025-12-03 19:52:51.864506074 +0000 UTC m=+16.293304523,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.396813 master-0 kubenswrapper[4813]: E1203 19:53:00.396664 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dcc9597a19540 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:51.865244992 +0000 UTC m=+16.294043441,LastTimestamp:2025-12-03 19:52:51.865244992 +0000 UTC m=+16.294043441,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.402619 master-0 kubenswrapper[4813]: E1203 19:53:00.402505 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" 
event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187dcc95984a3230 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:d78739a7694769882b7e47ea5ac08a10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:51.876295216 +0000 UTC m=+16.305093675,LastTimestamp:2025-12-03 19:52:51.876295216 +0000 UTC m=+16.305093675,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.407156 master-0 kubenswrapper[4813]: E1203 19:53:00.406989 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dcc95985535fb kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:51.877017083 +0000 UTC m=+16.305815532,LastTimestamp:2025-12-03 19:52:51.877017083 +0000 UTC m=+16.305815532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.414607 master-0 kubenswrapper[4813]: E1203 19:53:00.414425 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dcc959862ca00 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e254a7fb8a2643817718cfdb54bc819e86eb84232f6e2456548c55c5efb09d2\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:51.877906944 +0000 UTC m=+16.306705403,LastTimestamp:2025-12-03 19:52:51.877906944 +0000 UTC m=+16.306705403,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.421583 master-0 kubenswrapper[4813]: E1203 19:53:00.421427 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187dcc9598e904da kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:d78739a7694769882b7e47ea5ac08a10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:51.886703834 +0000 UTC m=+16.315502293,LastTimestamp:2025-12-03 19:52:51.886703834 +0000 UTC m=+16.315502293,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.426820 master-0 kubenswrapper[4813]: E1203 19:53:00.426668 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dcc95a50b6303 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:52.090282755 +0000 UTC m=+16.519081224,LastTimestamp:2025-12-03 19:52:52.090282755 +0000 UTC m=+16.519081224,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.431016 master-0 kubenswrapper[4813]: E1203 19:53:00.430853 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dcc95b31ce50b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created 
container: kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:52.326311179 +0000 UTC m=+16.755109658,LastTimestamp:2025-12-03 19:52:52.326311179 +0000 UTC m=+16.755109658,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.434968 master-0 kubenswrapper[4813]: E1203 19:53:00.434827 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dcc95b3eee9d8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:52.340074968 +0000 UTC m=+16.768873447,LastTimestamp:2025-12-03 19:52:52.340074968 +0000 UTC m=+16.768873447,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.439239 master-0 kubenswrapper[4813]: E1203 19:53:00.439125 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dcc95b405c1c2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:52.341572034 +0000 UTC m=+16.770370523,LastTimestamp:2025-12-03 19:52:52.341572034 +0000 UTC m=+16.770370523,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.444501 master-0 kubenswrapper[4813]: E1203 19:53:00.444336 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dcc9647044c60 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e254a7fb8a2643817718cfdb54bc819e86eb84232f6e2456548c55c5efb09d2\" in 2.929s (2.929s including waiting). 
Image size: 499719811 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:54.8077272 +0000 UTC m=+19.236525649,LastTimestamp:2025-12-03 19:52:54.8077272 +0000 UTC m=+19.236525649,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.449195 master-0 kubenswrapper[4813]: E1203 19:53:00.449059 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dcc9647b76f17 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\" in 2.477s (2.477s including waiting). 
Image size: 509451797 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:54.819467031 +0000 UTC m=+19.248265480,LastTimestamp:2025-12-03 19:52:54.819467031 +0000 UTC m=+19.248265480,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.452218 master-0 kubenswrapper[4813]: E1203 19:53:00.452112 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dcc96518036b8 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:54.98362028 +0000 UTC m=+19.412418749,LastTimestamp:2025-12-03 19:52:54.98362028 +0000 UTC m=+19.412418749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.456940 master-0 kubenswrapper[4813]: E1203 19:53:00.456160 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dcc9651b9ab78 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:54.98738572 +0000 UTC m=+19.416184179,LastTimestamp:2025-12-03 19:52:54.98738572 +0000 UTC m=+19.416184179,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.460585 master-0 kubenswrapper[4813]: E1203 19:53:00.460454 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dcc9652256001 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:54.994444289 +0000 UTC m=+19.423242758,LastTimestamp:2025-12-03 19:52:54.994444289 +0000 UTC m=+19.423242758,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.464709 master-0 kubenswrapper[4813]: E1203 19:53:00.464582 4813 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dcc96523e4b4c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:54.996077388 +0000 UTC m=+19.424875837,LastTimestamp:2025-12-03 19:52:54.996077388 +0000 UTC m=+19.424875837,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:00.810356 master-0 kubenswrapper[4813]: I1203 19:53:00.810263 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 19:53:01.023561 master-0 kubenswrapper[4813]: I1203 19:53:01.023498 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:53:01.025073 master-0 kubenswrapper[4813]: I1203 19:53:01.025022 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:53:01.025185 master-0 kubenswrapper[4813]: I1203 19:53:01.025097 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:53:01.025185 master-0 kubenswrapper[4813]: I1203 19:53:01.025116 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:53:01.025659 master-0 kubenswrapper[4813]: I1203 19:53:01.025623 4813 
scope.go:117] "RemoveContainer" containerID="448bf0aad34244395721a7bea2c8730e9a87d7dd5059f2a2e11feb2e9df02380" Dec 03 19:53:01.033921 master-0 kubenswrapper[4813]: E1203 19:53:01.033528 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187dcc9312c869b2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dcc9312c869b2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b03d2897e7cc0e8d0c306acb68ca3d9396d502882c14942faadfdb16bc40e17d\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:41.046485426 +0000 UTC m=+5.475283885,LastTimestamp:2025-12-03 19:53:01.028060756 +0000 UTC m=+25.456859235,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:01.115258 master-0 kubenswrapper[4813]: I1203 19:53:01.115164 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:53:01.116429 master-0 kubenswrapper[4813]: I1203 19:53:01.115931 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:53:01.116429 master-0 kubenswrapper[4813]: I1203 19:53:01.115972 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:53:01.116429 master-0 kubenswrapper[4813]: I1203 
19:53:01.115989 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:53:01.256574 master-0 kubenswrapper[4813]: E1203 19:53:01.256378 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187dcc931e21dcab\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dcc931e21dcab openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:41.236896939 +0000 UTC m=+5.665695388,LastTimestamp:2025-12-03 19:53:01.250133838 +0000 UTC m=+25.678932327,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:01.265940 master-0 kubenswrapper[4813]: E1203 19:53:01.265300 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187dcc931ed1a3e7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dcc931ed1a3e7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:41.248416743 +0000 UTC m=+5.677215192,LastTimestamp:2025-12-03 19:53:01.260550556 +0000 UTC m=+25.689349015,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:01.808558 master-0 kubenswrapper[4813]: I1203 19:53:01.808435 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 19:53:02.120374 master-0 kubenswrapper[4813]: I1203 19:53:02.120333 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/2.log" Dec 03 19:53:02.121066 master-0 kubenswrapper[4813]: I1203 19:53:02.120885 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/1.log" Dec 03 19:53:02.121455 master-0 kubenswrapper[4813]: I1203 19:53:02.121416 4813 generic.go:334] "Generic (PLEG): container finished" podID="b495b0c38f2c54e7cc46282c5f92aab5" containerID="9f73ecf27633bbc37b55a45d007054279e2297ba5fe10cc1fbc7253bd9c8db2b" exitCode=1 Dec 03 19:53:02.121505 master-0 kubenswrapper[4813]: I1203 19:53:02.121453 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" 
event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerDied","Data":"9f73ecf27633bbc37b55a45d007054279e2297ba5fe10cc1fbc7253bd9c8db2b"} Dec 03 19:53:02.121505 master-0 kubenswrapper[4813]: I1203 19:53:02.121488 4813 scope.go:117] "RemoveContainer" containerID="448bf0aad34244395721a7bea2c8730e9a87d7dd5059f2a2e11feb2e9df02380" Dec 03 19:53:02.121682 master-0 kubenswrapper[4813]: I1203 19:53:02.121639 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:53:02.122734 master-0 kubenswrapper[4813]: I1203 19:53:02.122701 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:53:02.122734 master-0 kubenswrapper[4813]: I1203 19:53:02.122732 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:53:02.122833 master-0 kubenswrapper[4813]: I1203 19:53:02.122744 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:53:02.123086 master-0 kubenswrapper[4813]: I1203 19:53:02.123058 4813 scope.go:117] "RemoveContainer" containerID="9f73ecf27633bbc37b55a45d007054279e2297ba5fe10cc1fbc7253bd9c8db2b" Dec 03 19:53:02.123244 master-0 kubenswrapper[4813]: E1203 19:53:02.123210 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="b495b0c38f2c54e7cc46282c5f92aab5" Dec 03 19:53:02.127755 master-0 kubenswrapper[4813]: E1203 19:53:02.127541 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187dcc9479cdee93\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dcc9479cdee93 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:47.069867667 +0000 UTC m=+11.498666156,LastTimestamp:2025-12-03 19:53:02.123183221 +0000 UTC m=+26.551981680,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:02.424297 master-0 kubenswrapper[4813]: I1203 19:53:02.424166 4813 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:53:02.424297 master-0 kubenswrapper[4813]: I1203 19:53:02.424275 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:53:02.426541 master-0 kubenswrapper[4813]: I1203 19:53:02.426487 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:53:02.426541 master-0 kubenswrapper[4813]: I1203 19:53:02.426536 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:53:02.426634 master-0 kubenswrapper[4813]: I1203 19:53:02.426547 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientPID" Dec 03 19:53:02.434514 master-0 kubenswrapper[4813]: E1203 19:53:02.434462 4813 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Dec 03 19:53:02.689303 master-0 kubenswrapper[4813]: I1203 19:53:02.689093 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:53:02.691260 master-0 kubenswrapper[4813]: I1203 19:53:02.691172 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:53:02.691260 master-0 kubenswrapper[4813]: I1203 19:53:02.691250 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:53:02.691490 master-0 kubenswrapper[4813]: I1203 19:53:02.691282 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:53:02.691490 master-0 kubenswrapper[4813]: I1203 19:53:02.691360 4813 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 19:53:02.702030 master-0 kubenswrapper[4813]: E1203 19:53:02.701960 4813 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Dec 03 19:53:02.809607 master-0 kubenswrapper[4813]: I1203 19:53:02.809519 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 19:53:03.125303 master-0 kubenswrapper[4813]: I1203 19:53:03.125237 4813 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/2.log" Dec 03 19:53:03.812095 master-0 kubenswrapper[4813]: I1203 19:53:03.812006 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 19:53:04.263128 master-0 kubenswrapper[4813]: I1203 19:53:04.263050 4813 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:53:04.263847 master-0 kubenswrapper[4813]: I1203 19:53:04.263251 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:53:04.264735 master-0 kubenswrapper[4813]: I1203 19:53:04.264666 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:53:04.264735 master-0 kubenswrapper[4813]: I1203 19:53:04.264719 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:53:04.264735 master-0 kubenswrapper[4813]: I1203 19:53:04.264736 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:53:04.271248 master-0 kubenswrapper[4813]: I1203 19:53:04.271159 4813 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:53:04.807845 master-0 kubenswrapper[4813]: I1203 19:53:04.807678 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 19:53:05.130386 master-0 
kubenswrapper[4813]: I1203 19:53:05.130344 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:53:05.131064 master-0 kubenswrapper[4813]: I1203 19:53:05.131002 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:53:05.131064 master-0 kubenswrapper[4813]: I1203 19:53:05.131039 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:53:05.131064 master-0 kubenswrapper[4813]: I1203 19:53:05.131048 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:53:05.138057 master-0 kubenswrapper[4813]: I1203 19:53:05.138000 4813 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:53:05.807664 master-0 kubenswrapper[4813]: I1203 19:53:05.807566 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 19:53:05.955623 master-0 kubenswrapper[4813]: E1203 19:53:05.955523 4813 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 03 19:53:06.133045 master-0 kubenswrapper[4813]: I1203 19:53:06.132954 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:53:06.134172 master-0 kubenswrapper[4813]: I1203 19:53:06.134116 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:53:06.134172 master-0 kubenswrapper[4813]: I1203 19:53:06.134155 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasNoDiskPressure" Dec 03 19:53:06.134172 master-0 kubenswrapper[4813]: I1203 19:53:06.134170 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:53:06.809883 master-0 kubenswrapper[4813]: I1203 19:53:06.809744 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 19:53:06.984129 master-0 kubenswrapper[4813]: W1203 19:53:06.984015 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Dec 03 19:53:06.984129 master-0 kubenswrapper[4813]: E1203 19:53:06.984091 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Dec 03 19:53:07.810259 master-0 kubenswrapper[4813]: I1203 19:53:07.810118 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 19:53:07.965391 master-0 kubenswrapper[4813]: I1203 19:53:07.965298 4813 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 03 19:53:07.982609 master-0 kubenswrapper[4813]: I1203 19:53:07.982541 4813 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 03 19:53:08.809622 master-0 kubenswrapper[4813]: I1203 19:53:08.809503 4813 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 19:53:09.443043 master-0 kubenswrapper[4813]: E1203 19:53:09.442894 4813 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Dec 03 19:53:09.703359 master-0 kubenswrapper[4813]: I1203 19:53:09.703136 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:53:09.704488 master-0 kubenswrapper[4813]: I1203 19:53:09.704432 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:53:09.704488 master-0 kubenswrapper[4813]: I1203 19:53:09.704474 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:53:09.704488 master-0 kubenswrapper[4813]: I1203 19:53:09.704485 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:53:09.704773 master-0 kubenswrapper[4813]: I1203 19:53:09.704532 4813 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 19:53:09.709868 master-0 kubenswrapper[4813]: E1203 19:53:09.709813 4813 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Dec 03 19:53:09.810825 master-0 kubenswrapper[4813]: I1203 19:53:09.810674 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot 
get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 19:53:10.812124 master-0 kubenswrapper[4813]: I1203 19:53:10.812020 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 19:53:11.807878 master-0 kubenswrapper[4813]: I1203 19:53:11.807829 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 19:53:12.808752 master-0 kubenswrapper[4813]: I1203 19:53:12.808670 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 19:53:13.023626 master-0 kubenswrapper[4813]: I1203 19:53:13.023532 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:53:13.025126 master-0 kubenswrapper[4813]: I1203 19:53:13.025045 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:53:13.025126 master-0 kubenswrapper[4813]: I1203 19:53:13.025129 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:53:13.025401 master-0 kubenswrapper[4813]: I1203 19:53:13.025147 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:53:13.025752 master-0 kubenswrapper[4813]: I1203 19:53:13.025699 4813 scope.go:117] "RemoveContainer" containerID="9f73ecf27633bbc37b55a45d007054279e2297ba5fe10cc1fbc7253bd9c8db2b" Dec 03 19:53:13.026031 
master-0 kubenswrapper[4813]: E1203 19:53:13.025969 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="b495b0c38f2c54e7cc46282c5f92aab5" Dec 03 19:53:13.033676 master-0 kubenswrapper[4813]: E1203 19:53:13.033457 4813 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187dcc9479cdee93\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dcc9479cdee93 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:52:47.069867667 +0000 UTC m=+11.498666156,LastTimestamp:2025-12-03 19:53:13.025923635 +0000 UTC m=+37.454722124,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:53:13.451062 master-0 kubenswrapper[4813]: W1203 19:53:13.450910 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list 
resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Dec 03 19:53:13.451062 master-0 kubenswrapper[4813]: E1203 19:53:13.451017 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Dec 03 19:53:13.776590 master-0 kubenswrapper[4813]: W1203 19:53:13.776425 4813 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Dec 03 19:53:13.776590 master-0 kubenswrapper[4813]: E1203 19:53:13.776498 4813 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Dec 03 19:53:13.810519 master-0 kubenswrapper[4813]: I1203 19:53:13.810411 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 19:53:14.809050 master-0 kubenswrapper[4813]: I1203 19:53:14.808847 4813 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 19:53:15.035377 master-0 kubenswrapper[4813]: I1203 19:53:15.035235 4813 csr.go:261] certificate signing request csr-xx74h is approved, waiting to be issued Dec 03 19:53:15.042171 master-0 kubenswrapper[4813]: 
I1203 19:53:15.042068 4813 csr.go:257] certificate signing request csr-xx74h is issued Dec 03 19:53:15.724143 master-0 kubenswrapper[4813]: I1203 19:53:15.724045 4813 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 03 19:53:15.818056 master-0 kubenswrapper[4813]: I1203 19:53:15.817964 4813 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 19:53:15.833454 master-0 kubenswrapper[4813]: I1203 19:53:15.833391 4813 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 19:53:15.897994 master-0 kubenswrapper[4813]: I1203 19:53:15.897940 4813 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 19:53:15.956267 master-0 kubenswrapper[4813]: E1203 19:53:15.956194 4813 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 03 19:53:16.044843 master-0 kubenswrapper[4813]: I1203 19:53:16.044635 4813 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2025-12-04 19:44:55 +0000 UTC, rotation deadline is 2025-12-04 16:52:20.065281411 +0000 UTC Dec 03 19:53:16.044843 master-0 kubenswrapper[4813]: I1203 19:53:16.044692 4813 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 20h59m4.020596315s for next certificate rotation Dec 03 19:53:16.154168 master-0 kubenswrapper[4813]: I1203 19:53:16.154083 4813 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 19:53:16.154168 master-0 kubenswrapper[4813]: E1203 19:53:16.154134 4813 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Dec 03 19:53:16.175258 master-0 kubenswrapper[4813]: I1203 19:53:16.175152 4813 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not 
found Dec 03 19:53:16.191820 master-0 kubenswrapper[4813]: I1203 19:53:16.191739 4813 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 19:53:16.252000 master-0 kubenswrapper[4813]: I1203 19:53:16.251904 4813 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 19:53:16.450971 master-0 kubenswrapper[4813]: E1203 19:53:16.450875 4813 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-0\" not found" node="master-0" Dec 03 19:53:16.520504 master-0 kubenswrapper[4813]: I1203 19:53:16.520409 4813 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 19:53:16.520504 master-0 kubenswrapper[4813]: E1203 19:53:16.520455 4813 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Dec 03 19:53:16.624237 master-0 kubenswrapper[4813]: I1203 19:53:16.624132 4813 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 19:53:16.639431 master-0 kubenswrapper[4813]: I1203 19:53:16.639357 4813 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 19:53:16.695762 master-0 kubenswrapper[4813]: I1203 19:53:16.695698 4813 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 19:53:16.710406 master-0 kubenswrapper[4813]: I1203 19:53:16.710284 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:53:16.711637 master-0 kubenswrapper[4813]: I1203 19:53:16.711592 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:53:16.711759 master-0 kubenswrapper[4813]: I1203 19:53:16.711665 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:53:16.711759 master-0 
kubenswrapper[4813]: I1203 19:53:16.711695 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:53:16.711871 master-0 kubenswrapper[4813]: I1203 19:53:16.711767 4813 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 19:53:16.722050 master-0 kubenswrapper[4813]: I1203 19:53:16.721998 4813 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Dec 03 19:53:16.722050 master-0 kubenswrapper[4813]: E1203 19:53:16.722046 4813 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Dec 03 19:53:16.733966 master-0 kubenswrapper[4813]: E1203 19:53:16.733912 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:16.829531 master-0 kubenswrapper[4813]: I1203 19:53:16.829451 4813 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Dec 03 19:53:16.834729 master-0 kubenswrapper[4813]: E1203 19:53:16.834663 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:16.840749 master-0 kubenswrapper[4813]: I1203 19:53:16.840698 4813 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 03 19:53:16.935275 master-0 kubenswrapper[4813]: E1203 19:53:16.935190 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:17.035537 master-0 kubenswrapper[4813]: E1203 19:53:17.035340 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:17.136345 master-0 kubenswrapper[4813]: E1203 19:53:17.136219 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 
19:53:17.236895 master-0 kubenswrapper[4813]: E1203 19:53:17.236748 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:17.337918 master-0 kubenswrapper[4813]: E1203 19:53:17.337682 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:17.438932 master-0 kubenswrapper[4813]: E1203 19:53:17.438881 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:17.539856 master-0 kubenswrapper[4813]: E1203 19:53:17.539752 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:17.640478 master-0 kubenswrapper[4813]: E1203 19:53:17.640363 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:17.740591 master-0 kubenswrapper[4813]: E1203 19:53:17.740528 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:17.841629 master-0 kubenswrapper[4813]: E1203 19:53:17.841521 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:17.942637 master-0 kubenswrapper[4813]: E1203 19:53:17.942447 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:18.043464 master-0 kubenswrapper[4813]: E1203 19:53:18.043336 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:18.144807 master-0 kubenswrapper[4813]: E1203 19:53:18.144645 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:18.245101 master-0 kubenswrapper[4813]: E1203 19:53:18.244958 4813 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"master-0\" not found" Dec 03 19:53:18.345319 master-0 kubenswrapper[4813]: E1203 19:53:18.345213 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:18.445499 master-0 kubenswrapper[4813]: E1203 19:53:18.445395 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:18.546674 master-0 kubenswrapper[4813]: E1203 19:53:18.546480 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:18.646890 master-0 kubenswrapper[4813]: E1203 19:53:18.646742 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:18.747516 master-0 kubenswrapper[4813]: E1203 19:53:18.747464 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:18.848728 master-0 kubenswrapper[4813]: E1203 19:53:18.848601 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:18.949756 master-0 kubenswrapper[4813]: E1203 19:53:18.949682 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:19.050829 master-0 kubenswrapper[4813]: E1203 19:53:19.050692 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:19.151671 master-0 kubenswrapper[4813]: E1203 19:53:19.151573 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:19.252404 master-0 kubenswrapper[4813]: E1203 19:53:19.252334 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:19.352620 master-0 kubenswrapper[4813]: E1203 19:53:19.352530 
4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:19.453176 master-0 kubenswrapper[4813]: E1203 19:53:19.453015 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:19.553714 master-0 kubenswrapper[4813]: E1203 19:53:19.553623 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:19.654451 master-0 kubenswrapper[4813]: E1203 19:53:19.654362 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:19.754882 master-0 kubenswrapper[4813]: E1203 19:53:19.754632 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:19.805324 master-0 kubenswrapper[4813]: I1203 19:53:19.805267 4813 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 19:53:19.854978 master-0 kubenswrapper[4813]: E1203 19:53:19.854921 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:19.955146 master-0 kubenswrapper[4813]: E1203 19:53:19.955035 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:20.056130 master-0 kubenswrapper[4813]: E1203 19:53:20.055944 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:20.156433 master-0 kubenswrapper[4813]: E1203 19:53:20.156329 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:20.257387 master-0 kubenswrapper[4813]: E1203 19:53:20.257292 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:20.357916 
master-0 kubenswrapper[4813]: E1203 19:53:20.357615 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:20.458698 master-0 kubenswrapper[4813]: E1203 19:53:20.458564 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:20.559637 master-0 kubenswrapper[4813]: E1203 19:53:20.559530 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:20.659940 master-0 kubenswrapper[4813]: E1203 19:53:20.659852 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:20.760413 master-0 kubenswrapper[4813]: E1203 19:53:20.760302 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:20.860948 master-0 kubenswrapper[4813]: E1203 19:53:20.860842 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:20.962297 master-0 kubenswrapper[4813]: E1203 19:53:20.962106 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:21.062698 master-0 kubenswrapper[4813]: E1203 19:53:21.062563 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:21.162952 master-0 kubenswrapper[4813]: E1203 19:53:21.162874 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:21.264079 master-0 kubenswrapper[4813]: E1203 19:53:21.263867 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:21.364531 master-0 kubenswrapper[4813]: E1203 19:53:21.364471 4813 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"master-0\" not found" Dec 03 19:53:21.465031 master-0 kubenswrapper[4813]: E1203 19:53:21.464958 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:21.565720 master-0 kubenswrapper[4813]: E1203 19:53:21.565556 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:21.666057 master-0 kubenswrapper[4813]: E1203 19:53:21.665925 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:21.767289 master-0 kubenswrapper[4813]: E1203 19:53:21.767168 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:21.868022 master-0 kubenswrapper[4813]: E1203 19:53:21.867921 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:21.969039 master-0 kubenswrapper[4813]: E1203 19:53:21.968918 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:22.069733 master-0 kubenswrapper[4813]: E1203 19:53:22.069643 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:22.170221 master-0 kubenswrapper[4813]: E1203 19:53:22.170022 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:22.270449 master-0 kubenswrapper[4813]: E1203 19:53:22.270343 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:22.371483 master-0 kubenswrapper[4813]: E1203 19:53:22.371380 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:22.472288 master-0 kubenswrapper[4813]: E1203 19:53:22.472106 4813 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:22.572733 master-0 kubenswrapper[4813]: E1203 19:53:22.572631 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:22.672916 master-0 kubenswrapper[4813]: E1203 19:53:22.672840 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:22.773646 master-0 kubenswrapper[4813]: E1203 19:53:22.773443 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:22.874523 master-0 kubenswrapper[4813]: E1203 19:53:22.874385 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:22.974914 master-0 kubenswrapper[4813]: E1203 19:53:22.974815 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:23.075510 master-0 kubenswrapper[4813]: E1203 19:53:23.075291 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:23.176247 master-0 kubenswrapper[4813]: E1203 19:53:23.176144 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:23.276858 master-0 kubenswrapper[4813]: E1203 19:53:23.276662 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:23.377028 master-0 kubenswrapper[4813]: E1203 19:53:23.376884 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:23.477878 master-0 kubenswrapper[4813]: E1203 19:53:23.477772 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:23.578602 
master-0 kubenswrapper[4813]: E1203 19:53:23.577974 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:23.679159 master-0 kubenswrapper[4813]: E1203 19:53:23.678877 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:23.779280 master-0 kubenswrapper[4813]: E1203 19:53:23.779178 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:23.880221 master-0 kubenswrapper[4813]: E1203 19:53:23.880090 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:23.981257 master-0 kubenswrapper[4813]: E1203 19:53:23.980998 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:24.082291 master-0 kubenswrapper[4813]: E1203 19:53:24.082154 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:24.183132 master-0 kubenswrapper[4813]: E1203 19:53:24.183022 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:24.283955 master-0 kubenswrapper[4813]: E1203 19:53:24.283626 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:24.384820 master-0 kubenswrapper[4813]: E1203 19:53:24.384707 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:24.485623 master-0 kubenswrapper[4813]: E1203 19:53:24.485504 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:24.586485 master-0 kubenswrapper[4813]: E1203 19:53:24.586293 4813 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"master-0\" not found" Dec 03 19:53:24.687120 master-0 kubenswrapper[4813]: E1203 19:53:24.687007 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:24.787465 master-0 kubenswrapper[4813]: E1203 19:53:24.787383 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:24.888564 master-0 kubenswrapper[4813]: E1203 19:53:24.888471 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:24.990069 master-0 kubenswrapper[4813]: E1203 19:53:24.989294 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:25.023316 master-0 kubenswrapper[4813]: I1203 19:53:25.023203 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:53:25.024637 master-0 kubenswrapper[4813]: I1203 19:53:25.024566 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:53:25.024637 master-0 kubenswrapper[4813]: I1203 19:53:25.024620 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:53:25.024637 master-0 kubenswrapper[4813]: I1203 19:53:25.024637 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:53:25.025190 master-0 kubenswrapper[4813]: I1203 19:53:25.025141 4813 scope.go:117] "RemoveContainer" containerID="9f73ecf27633bbc37b55a45d007054279e2297ba5fe10cc1fbc7253bd9c8db2b" Dec 03 19:53:25.089952 master-0 kubenswrapper[4813]: E1203 19:53:25.089864 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:25.190972 master-0 kubenswrapper[4813]: E1203 19:53:25.190833 4813 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:25.292114 master-0 kubenswrapper[4813]: E1203 19:53:25.292009 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:25.393205 master-0 kubenswrapper[4813]: E1203 19:53:25.393143 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:25.494466 master-0 kubenswrapper[4813]: E1203 19:53:25.494219 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:25.594424 master-0 kubenswrapper[4813]: E1203 19:53:25.594354 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:25.695651 master-0 kubenswrapper[4813]: E1203 19:53:25.695565 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:25.796388 master-0 kubenswrapper[4813]: E1203 19:53:25.796232 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:25.891983 master-0 kubenswrapper[4813]: I1203 19:53:25.891912 4813 csr.go:261] certificate signing request csr-4xc2z is approved, waiting to be issued Dec 03 19:53:25.897440 master-0 kubenswrapper[4813]: E1203 19:53:25.897380 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:25.905621 master-0 kubenswrapper[4813]: I1203 19:53:25.905280 4813 csr.go:257] certificate signing request csr-4xc2z is issued Dec 03 19:53:25.957165 master-0 kubenswrapper[4813]: E1203 19:53:25.957100 4813 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 03 19:53:25.998056 master-0 kubenswrapper[4813]: E1203 
19:53:25.997967 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:26.099163 master-0 kubenswrapper[4813]: E1203 19:53:26.099021 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:26.185305 master-0 kubenswrapper[4813]: I1203 19:53:26.185220 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/3.log" Dec 03 19:53:26.186037 master-0 kubenswrapper[4813]: I1203 19:53:26.185992 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/2.log" Dec 03 19:53:26.186573 master-0 kubenswrapper[4813]: I1203 19:53:26.186530 4813 generic.go:334] "Generic (PLEG): container finished" podID="b495b0c38f2c54e7cc46282c5f92aab5" containerID="1e627b854436f132d47750eca5e55963c07ce2a82bb65e7317d2c359a44e0385" exitCode=1 Dec 03 19:53:26.186668 master-0 kubenswrapper[4813]: I1203 19:53:26.186577 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerDied","Data":"1e627b854436f132d47750eca5e55963c07ce2a82bb65e7317d2c359a44e0385"} Dec 03 19:53:26.186668 master-0 kubenswrapper[4813]: I1203 19:53:26.186623 4813 scope.go:117] "RemoveContainer" containerID="9f73ecf27633bbc37b55a45d007054279e2297ba5fe10cc1fbc7253bd9c8db2b" Dec 03 19:53:26.186815 master-0 kubenswrapper[4813]: I1203 19:53:26.186742 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:53:26.187932 master-0 kubenswrapper[4813]: I1203 19:53:26.187892 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientMemory" Dec 03 19:53:26.187932 master-0 kubenswrapper[4813]: I1203 19:53:26.187927 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:53:26.188070 master-0 kubenswrapper[4813]: I1203 19:53:26.187942 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:53:26.188758 master-0 kubenswrapper[4813]: I1203 19:53:26.188287 4813 scope.go:117] "RemoveContainer" containerID="1e627b854436f132d47750eca5e55963c07ce2a82bb65e7317d2c359a44e0385" Dec 03 19:53:26.188758 master-0 kubenswrapper[4813]: E1203 19:53:26.188506 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="b495b0c38f2c54e7cc46282c5f92aab5" Dec 03 19:53:26.199797 master-0 kubenswrapper[4813]: E1203 19:53:26.199729 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:26.300861 master-0 kubenswrapper[4813]: E1203 19:53:26.300786 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:26.401968 master-0 kubenswrapper[4813]: E1203 19:53:26.401861 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:26.502917 master-0 kubenswrapper[4813]: E1203 19:53:26.502848 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:26.604087 master-0 kubenswrapper[4813]: E1203 19:53:26.603982 4813 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"master-0\" not found" Dec 03 19:53:26.704351 master-0 kubenswrapper[4813]: E1203 19:53:26.704121 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:26.805487 master-0 kubenswrapper[4813]: E1203 19:53:26.805373 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:26.815294 master-0 kubenswrapper[4813]: E1203 19:53:26.815244 4813 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Dec 03 19:53:26.906471 master-0 kubenswrapper[4813]: E1203 19:53:26.906380 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:26.907597 master-0 kubenswrapper[4813]: I1203 19:53:26.907543 4813 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-12-04 19:44:55 +0000 UTC, rotation deadline is 2025-12-04 14:57:14.12972661 +0000 UTC Dec 03 19:53:26.907597 master-0 kubenswrapper[4813]: I1203 19:53:26.907587 4813 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 19h3m47.222142514s for next certificate rotation Dec 03 19:53:27.007707 master-0 kubenswrapper[4813]: E1203 19:53:27.007507 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:27.107698 master-0 kubenswrapper[4813]: E1203 19:53:27.107628 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:27.191416 master-0 kubenswrapper[4813]: I1203 19:53:27.191351 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/3.log" Dec 03 19:53:27.208524 master-0 kubenswrapper[4813]: E1203 19:53:27.208448 4813 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:27.308857 master-0 kubenswrapper[4813]: E1203 19:53:27.308648 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:27.409927 master-0 kubenswrapper[4813]: E1203 19:53:27.409836 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:27.510647 master-0 kubenswrapper[4813]: E1203 19:53:27.510533 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:27.611703 master-0 kubenswrapper[4813]: E1203 19:53:27.611653 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:27.711894 master-0 kubenswrapper[4813]: E1203 19:53:27.711838 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:27.813134 master-0 kubenswrapper[4813]: E1203 19:53:27.813042 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:27.908265 master-0 kubenswrapper[4813]: I1203 19:53:27.908066 4813 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-12-04 19:44:55 +0000 UTC, rotation deadline is 2025-12-04 14:52:34.611122008 +0000 UTC Dec 03 19:53:27.908265 master-0 kubenswrapper[4813]: I1203 19:53:27.908133 4813 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h59m6.702998636s for next certificate rotation Dec 03 19:53:27.913411 master-0 kubenswrapper[4813]: E1203 19:53:27.913360 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:28.014004 master-0 kubenswrapper[4813]: E1203 19:53:28.013948 4813 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"master-0\" not found" Dec 03 19:53:28.114309 master-0 kubenswrapper[4813]: E1203 19:53:28.114262 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:28.214714 master-0 kubenswrapper[4813]: E1203 19:53:28.214592 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:28.315373 master-0 kubenswrapper[4813]: E1203 19:53:28.315292 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:28.416573 master-0 kubenswrapper[4813]: E1203 19:53:28.416472 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:28.516969 master-0 kubenswrapper[4813]: E1203 19:53:28.516629 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:28.617915 master-0 kubenswrapper[4813]: E1203 19:53:28.617774 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:28.718324 master-0 kubenswrapper[4813]: E1203 19:53:28.718187 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:28.819524 master-0 kubenswrapper[4813]: E1203 19:53:28.819352 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:28.920175 master-0 kubenswrapper[4813]: E1203 19:53:28.920088 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:29.021011 master-0 kubenswrapper[4813]: E1203 19:53:29.020927 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:29.121700 master-0 kubenswrapper[4813]: E1203 
19:53:29.121618 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:29.221939 master-0 kubenswrapper[4813]: E1203 19:53:29.221854 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:29.323021 master-0 kubenswrapper[4813]: E1203 19:53:29.322915 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:29.423550 master-0 kubenswrapper[4813]: E1203 19:53:29.423378 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:29.523752 master-0 kubenswrapper[4813]: E1203 19:53:29.523646 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:29.624954 master-0 kubenswrapper[4813]: E1203 19:53:29.624840 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:29.725208 master-0 kubenswrapper[4813]: E1203 19:53:29.725003 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:29.826350 master-0 kubenswrapper[4813]: E1203 19:53:29.826236 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:29.926527 master-0 kubenswrapper[4813]: E1203 19:53:29.926375 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:30.027269 master-0 kubenswrapper[4813]: E1203 19:53:30.027048 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:30.127370 master-0 kubenswrapper[4813]: E1203 19:53:30.127267 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" 
Dec 03 19:53:30.227728 master-0 kubenswrapper[4813]: E1203 19:53:30.227645 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:30.328305 master-0 kubenswrapper[4813]: E1203 19:53:30.328117 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:30.428358 master-0 kubenswrapper[4813]: E1203 19:53:30.428260 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:30.529503 master-0 kubenswrapper[4813]: E1203 19:53:30.529394 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:30.629830 master-0 kubenswrapper[4813]: E1203 19:53:30.629709 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:30.730688 master-0 kubenswrapper[4813]: E1203 19:53:30.730598 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:30.831389 master-0 kubenswrapper[4813]: E1203 19:53:30.831267 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:30.932231 master-0 kubenswrapper[4813]: E1203 19:53:30.932054 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:31.033138 master-0 kubenswrapper[4813]: E1203 19:53:31.033027 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:31.133665 master-0 kubenswrapper[4813]: E1203 19:53:31.133572 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:31.234061 master-0 kubenswrapper[4813]: E1203 19:53:31.233866 4813 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"master-0\" not found" Dec 03 19:53:31.334114 master-0 kubenswrapper[4813]: E1203 19:53:31.334013 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:31.435277 master-0 kubenswrapper[4813]: E1203 19:53:31.435174 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:31.536084 master-0 kubenswrapper[4813]: E1203 19:53:31.535879 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:31.636964 master-0 kubenswrapper[4813]: E1203 19:53:31.636859 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:31.737855 master-0 kubenswrapper[4813]: E1203 19:53:31.737702 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:31.838436 master-0 kubenswrapper[4813]: E1203 19:53:31.838297 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:31.938901 master-0 kubenswrapper[4813]: E1203 19:53:31.938768 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:32.039756 master-0 kubenswrapper[4813]: E1203 19:53:32.039674 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:32.140873 master-0 kubenswrapper[4813]: E1203 19:53:32.140834 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:32.241261 master-0 kubenswrapper[4813]: E1203 19:53:32.241151 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:32.341740 master-0 kubenswrapper[4813]: E1203 
19:53:32.341633 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:32.442996 master-0 kubenswrapper[4813]: E1203 19:53:32.442847 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:32.543118 master-0 kubenswrapper[4813]: E1203 19:53:32.543034 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:32.644152 master-0 kubenswrapper[4813]: E1203 19:53:32.644094 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:32.745120 master-0 kubenswrapper[4813]: E1203 19:53:32.744941 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:32.845565 master-0 kubenswrapper[4813]: E1203 19:53:32.845457 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:32.946687 master-0 kubenswrapper[4813]: E1203 19:53:32.946554 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:33.047757 master-0 kubenswrapper[4813]: E1203 19:53:33.047574 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:33.148580 master-0 kubenswrapper[4813]: E1203 19:53:33.148516 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:33.248849 master-0 kubenswrapper[4813]: E1203 19:53:33.248704 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:33.349154 master-0 kubenswrapper[4813]: E1203 19:53:33.348969 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" 
Dec 03 19:53:33.450134 master-0 kubenswrapper[4813]: E1203 19:53:33.450040 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:33.550259 master-0 kubenswrapper[4813]: E1203 19:53:33.550162 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:33.651463 master-0 kubenswrapper[4813]: E1203 19:53:33.651331 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:33.752625 master-0 kubenswrapper[4813]: E1203 19:53:33.752503 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:33.852839 master-0 kubenswrapper[4813]: E1203 19:53:33.852704 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:33.953842 master-0 kubenswrapper[4813]: E1203 19:53:33.953623 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:34.054964 master-0 kubenswrapper[4813]: E1203 19:53:34.054865 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:34.155973 master-0 kubenswrapper[4813]: E1203 19:53:34.155881 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:34.256849 master-0 kubenswrapper[4813]: E1203 19:53:34.256630 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:34.357134 master-0 kubenswrapper[4813]: E1203 19:53:34.357063 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:34.458110 master-0 kubenswrapper[4813]: E1203 19:53:34.458005 4813 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"master-0\" not found" Dec 03 19:53:34.559267 master-0 kubenswrapper[4813]: E1203 19:53:34.559121 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:34.659317 master-0 kubenswrapper[4813]: E1203 19:53:34.659226 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:34.760174 master-0 kubenswrapper[4813]: E1203 19:53:34.760114 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:34.860948 master-0 kubenswrapper[4813]: E1203 19:53:34.860899 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:34.961998 master-0 kubenswrapper[4813]: E1203 19:53:34.961887 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:35.062748 master-0 kubenswrapper[4813]: E1203 19:53:35.062651 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:35.163422 master-0 kubenswrapper[4813]: E1203 19:53:35.163232 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:35.264371 master-0 kubenswrapper[4813]: E1203 19:53:35.264269 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:35.364732 master-0 kubenswrapper[4813]: E1203 19:53:35.364670 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:35.465867 master-0 kubenswrapper[4813]: E1203 19:53:35.465696 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:35.566812 master-0 kubenswrapper[4813]: E1203 
19:53:35.566726 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:35.668002 master-0 kubenswrapper[4813]: E1203 19:53:35.667874 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:35.769160 master-0 kubenswrapper[4813]: E1203 19:53:35.768961 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:35.870020 master-0 kubenswrapper[4813]: E1203 19:53:35.869942 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:35.958227 master-0 kubenswrapper[4813]: E1203 19:53:35.958123 4813 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 03 19:53:35.970769 master-0 kubenswrapper[4813]: E1203 19:53:35.970687 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:36.071817 master-0 kubenswrapper[4813]: E1203 19:53:36.071567 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:36.172470 master-0 kubenswrapper[4813]: E1203 19:53:36.172367 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:36.273345 master-0 kubenswrapper[4813]: E1203 19:53:36.273234 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:36.374138 master-0 kubenswrapper[4813]: E1203 19:53:36.374000 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:36.475197 master-0 kubenswrapper[4813]: E1203 19:53:36.475071 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"master-0\" not found" Dec 03 19:53:36.576613 master-0 kubenswrapper[4813]: E1203 19:53:36.576257 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:36.677525 master-0 kubenswrapper[4813]: E1203 19:53:36.677337 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:36.777739 master-0 kubenswrapper[4813]: E1203 19:53:36.777626 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:36.878671 master-0 kubenswrapper[4813]: E1203 19:53:36.878539 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:36.979092 master-0 kubenswrapper[4813]: E1203 19:53:36.978904 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:36.983327 master-0 kubenswrapper[4813]: E1203 19:53:36.983279 4813 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Dec 03 19:53:37.079452 master-0 kubenswrapper[4813]: E1203 19:53:37.079373 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:37.180276 master-0 kubenswrapper[4813]: E1203 19:53:37.180173 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:37.281158 master-0 kubenswrapper[4813]: E1203 19:53:37.280993 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:37.381300 master-0 kubenswrapper[4813]: E1203 19:53:37.381197 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:37.482515 master-0 kubenswrapper[4813]: E1203 19:53:37.482421 
4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:37.583598 master-0 kubenswrapper[4813]: E1203 19:53:37.583398 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:37.684570 master-0 kubenswrapper[4813]: E1203 19:53:37.684474 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:37.785809 master-0 kubenswrapper[4813]: E1203 19:53:37.785684 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:37.886703 master-0 kubenswrapper[4813]: E1203 19:53:37.886608 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:37.987332 master-0 kubenswrapper[4813]: E1203 19:53:37.987237 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:38.088192 master-0 kubenswrapper[4813]: E1203 19:53:38.088076 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:38.189381 master-0 kubenswrapper[4813]: E1203 19:53:38.189214 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:38.290313 master-0 kubenswrapper[4813]: E1203 19:53:38.290229 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:38.390520 master-0 kubenswrapper[4813]: E1203 19:53:38.390414 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:38.490876 master-0 kubenswrapper[4813]: E1203 19:53:38.490622 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 
19:53:38.591648 master-0 kubenswrapper[4813]: E1203 19:53:38.591538 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:38.692553 master-0 kubenswrapper[4813]: E1203 19:53:38.692454 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:38.793125 master-0 kubenswrapper[4813]: E1203 19:53:38.792885 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:38.893621 master-0 kubenswrapper[4813]: E1203 19:53:38.893485 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:38.993822 master-0 kubenswrapper[4813]: E1203 19:53:38.993692 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:39.022868 master-0 kubenswrapper[4813]: I1203 19:53:39.022668 4813 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:53:39.024608 master-0 kubenswrapper[4813]: I1203 19:53:39.024353 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:53:39.024608 master-0 kubenswrapper[4813]: I1203 19:53:39.024400 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 19:53:39.024608 master-0 kubenswrapper[4813]: I1203 19:53:39.024415 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:53:39.026246 master-0 kubenswrapper[4813]: I1203 19:53:39.026039 4813 scope.go:117] "RemoveContainer" containerID="1e627b854436f132d47750eca5e55963c07ce2a82bb65e7317d2c359a44e0385" Dec 03 19:53:39.026482 master-0 kubenswrapper[4813]: E1203 19:53:39.026274 4813 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="b495b0c38f2c54e7cc46282c5f92aab5" Dec 03 19:53:39.094137 master-0 kubenswrapper[4813]: E1203 19:53:39.093881 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:39.195159 master-0 kubenswrapper[4813]: E1203 19:53:39.195071 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:39.296325 master-0 kubenswrapper[4813]: E1203 19:53:39.296203 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:39.396613 master-0 kubenswrapper[4813]: E1203 19:53:39.396505 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:39.497222 master-0 kubenswrapper[4813]: E1203 19:53:39.497049 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:39.597523 master-0 kubenswrapper[4813]: E1203 19:53:39.597371 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:39.698086 master-0 kubenswrapper[4813]: E1203 19:53:39.697868 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:39.798951 master-0 kubenswrapper[4813]: E1203 19:53:39.798865 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:39.899552 master-0 kubenswrapper[4813]: E1203 19:53:39.899451 4813 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"master-0\" not found" Dec 03 19:53:40.000135 master-0 kubenswrapper[4813]: E1203 19:53:39.999984 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:40.101141 master-0 kubenswrapper[4813]: E1203 19:53:40.101025 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:40.201945 master-0 kubenswrapper[4813]: E1203 19:53:40.201839 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:40.302984 master-0 kubenswrapper[4813]: E1203 19:53:40.302766 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:40.403987 master-0 kubenswrapper[4813]: E1203 19:53:40.403898 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:40.504423 master-0 kubenswrapper[4813]: E1203 19:53:40.504299 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:40.605462 master-0 kubenswrapper[4813]: E1203 19:53:40.605256 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:40.705612 master-0 kubenswrapper[4813]: E1203 19:53:40.705506 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:40.806155 master-0 kubenswrapper[4813]: E1203 19:53:40.806040 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:40.906708 master-0 kubenswrapper[4813]: E1203 19:53:40.906603 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:41.007456 master-0 kubenswrapper[4813]: E1203 19:53:41.007362 
4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:41.107718 master-0 kubenswrapper[4813]: E1203 19:53:41.107624 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:41.208116 master-0 kubenswrapper[4813]: E1203 19:53:41.207932 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:41.308648 master-0 kubenswrapper[4813]: E1203 19:53:41.308547 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:41.409514 master-0 kubenswrapper[4813]: E1203 19:53:41.409386 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:41.510469 master-0 kubenswrapper[4813]: E1203 19:53:41.510294 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:41.611605 master-0 kubenswrapper[4813]: E1203 19:53:41.611494 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:41.712453 master-0 kubenswrapper[4813]: E1203 19:53:41.712360 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:41.813190 master-0 kubenswrapper[4813]: E1203 19:53:41.813002 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:41.913504 master-0 kubenswrapper[4813]: E1203 19:53:41.913413 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:42.013906 master-0 kubenswrapper[4813]: E1203 19:53:42.013817 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 
19:53:42.114497 master-0 kubenswrapper[4813]: E1203 19:53:42.114439 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:42.214987 master-0 kubenswrapper[4813]: E1203 19:53:42.214894 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:42.315826 master-0 kubenswrapper[4813]: E1203 19:53:42.315618 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:42.416986 master-0 kubenswrapper[4813]: E1203 19:53:42.416767 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:42.517828 master-0 kubenswrapper[4813]: E1203 19:53:42.517603 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:42.618826 master-0 kubenswrapper[4813]: E1203 19:53:42.618697 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:42.720094 master-0 kubenswrapper[4813]: E1203 19:53:42.719884 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:42.821445 master-0 kubenswrapper[4813]: E1203 19:53:42.821104 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:42.921824 master-0 kubenswrapper[4813]: E1203 19:53:42.921711 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:43.022324 master-0 kubenswrapper[4813]: E1203 19:53:43.022127 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:43.122467 master-0 kubenswrapper[4813]: E1203 19:53:43.122401 4813 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"master-0\" not found" Dec 03 19:53:43.223615 master-0 kubenswrapper[4813]: E1203 19:53:43.223503 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:43.323882 master-0 kubenswrapper[4813]: E1203 19:53:43.323684 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:43.424621 master-0 kubenswrapper[4813]: E1203 19:53:43.424508 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:43.525565 master-0 kubenswrapper[4813]: E1203 19:53:43.525488 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:43.625736 master-0 kubenswrapper[4813]: E1203 19:53:43.625638 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:43.725968 master-0 kubenswrapper[4813]: E1203 19:53:43.725874 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:43.827041 master-0 kubenswrapper[4813]: E1203 19:53:43.826953 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:43.927435 master-0 kubenswrapper[4813]: E1203 19:53:43.927278 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:44.027983 master-0 kubenswrapper[4813]: E1203 19:53:44.027875 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:44.128962 master-0 kubenswrapper[4813]: E1203 19:53:44.128847 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:44.230302 master-0 kubenswrapper[4813]: E1203 19:53:44.230065 
4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:44.330240 master-0 kubenswrapper[4813]: E1203 19:53:44.330170 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:44.430613 master-0 kubenswrapper[4813]: E1203 19:53:44.430549 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:44.531578 master-0 kubenswrapper[4813]: E1203 19:53:44.531432 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:44.549009 master-0 kubenswrapper[4813]: I1203 19:53:44.548896 4813 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 19:53:44.631973 master-0 kubenswrapper[4813]: E1203 19:53:44.631863 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:44.732813 master-0 kubenswrapper[4813]: E1203 19:53:44.732651 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:44.833342 master-0 kubenswrapper[4813]: E1203 19:53:44.833141 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:44.933979 master-0 kubenswrapper[4813]: E1203 19:53:44.933864 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:45.034758 master-0 kubenswrapper[4813]: E1203 19:53:45.034662 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:45.134890 master-0 kubenswrapper[4813]: E1203 19:53:45.134758 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 
19:53:45.235429 master-0 kubenswrapper[4813]: E1203 19:53:45.235319 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:45.336571 master-0 kubenswrapper[4813]: E1203 19:53:45.336428 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:45.437718 master-0 kubenswrapper[4813]: E1203 19:53:45.437517 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:45.538577 master-0 kubenswrapper[4813]: E1203 19:53:45.538485 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:45.639404 master-0 kubenswrapper[4813]: E1203 19:53:45.639334 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:45.739680 master-0 kubenswrapper[4813]: E1203 19:53:45.739445 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:45.840239 master-0 kubenswrapper[4813]: E1203 19:53:45.840132 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:45.940472 master-0 kubenswrapper[4813]: E1203 19:53:45.940320 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:45.958760 master-0 kubenswrapper[4813]: E1203 19:53:45.958672 4813 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 03 19:53:46.041601 master-0 kubenswrapper[4813]: E1203 19:53:46.041399 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:46.142300 master-0 kubenswrapper[4813]: E1203 19:53:46.142191 4813 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:46.242363 master-0 kubenswrapper[4813]: E1203 19:53:46.242281 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:46.343147 master-0 kubenswrapper[4813]: E1203 19:53:46.342861 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:46.444076 master-0 kubenswrapper[4813]: E1203 19:53:46.443983 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:46.544816 master-0 kubenswrapper[4813]: E1203 19:53:46.544722 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:46.645768 master-0 kubenswrapper[4813]: E1203 19:53:46.645709 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:46.746148 master-0 kubenswrapper[4813]: E1203 19:53:46.746100 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:46.846370 master-0 kubenswrapper[4813]: E1203 19:53:46.846290 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:46.946991 master-0 kubenswrapper[4813]: E1203 19:53:46.946837 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:47.047612 master-0 kubenswrapper[4813]: E1203 19:53:47.047530 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:47.132597 master-0 kubenswrapper[4813]: E1203 19:53:47.132528 4813 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Dec 03 
19:53:47.148008 master-0 kubenswrapper[4813]: E1203 19:53:47.147966 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:47.248234 master-0 kubenswrapper[4813]: E1203 19:53:47.248070 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:47.349253 master-0 kubenswrapper[4813]: E1203 19:53:47.349189 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:47.449880 master-0 kubenswrapper[4813]: E1203 19:53:47.449729 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:47.550461 master-0 kubenswrapper[4813]: E1203 19:53:47.550253 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:47.651489 master-0 kubenswrapper[4813]: E1203 19:53:47.651365 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:47.751638 master-0 kubenswrapper[4813]: E1203 19:53:47.751530 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:47.852288 master-0 kubenswrapper[4813]: E1203 19:53:47.852085 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:47.953287 master-0 kubenswrapper[4813]: E1203 19:53:47.953134 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:48.053948 master-0 kubenswrapper[4813]: E1203 19:53:48.053867 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:48.154821 master-0 kubenswrapper[4813]: E1203 19:53:48.154738 4813 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"master-0\" not found" Dec 03 19:53:48.255009 master-0 kubenswrapper[4813]: E1203 19:53:48.254912 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:48.355159 master-0 kubenswrapper[4813]: E1203 19:53:48.355066 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:48.455386 master-0 kubenswrapper[4813]: E1203 19:53:48.455205 4813 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 19:53:48.480129 master-0 kubenswrapper[4813]: I1203 19:53:48.480075 4813 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 19:53:48.841438 master-0 kubenswrapper[4813]: I1203 19:53:48.841185 4813 apiserver.go:52] "Watching apiserver" Dec 03 19:53:48.844647 master-0 kubenswrapper[4813]: I1203 19:53:48.844592 4813 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 19:53:48.844897 master-0 kubenswrapper[4813]: I1203 19:53:48.844848 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-6cbf58c977-w7d8t","assisted-installer/assisted-installer-controller-ljsns","openshift-cluster-version/cluster-version-operator-869c786959-zbl42"] Dec 03 19:53:48.845317 master-0 kubenswrapper[4813]: I1203 19:53:48.845262 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:48.845317 master-0 kubenswrapper[4813]: I1203 19:53:48.845289 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:48.845510 master-0 kubenswrapper[4813]: I1203 19:53:48.845423 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" Dec 03 19:53:48.847768 master-0 kubenswrapper[4813]: I1203 19:53:48.847675 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 19:53:48.847768 master-0 kubenswrapper[4813]: I1203 19:53:48.847717 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"kube-root-ca.crt" Dec 03 19:53:48.848399 master-0 kubenswrapper[4813]: I1203 19:53:48.848235 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 19:53:48.848609 master-0 kubenswrapper[4813]: I1203 19:53:48.848561 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 19:53:48.849265 master-0 kubenswrapper[4813]: I1203 19:53:48.849097 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config" Dec 03 19:53:48.849265 master-0 kubenswrapper[4813]: I1203 19:53:48.849181 4813 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret" Dec 03 19:53:48.850553 master-0 kubenswrapper[4813]: I1203 19:53:48.850120 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 19:53:48.850553 master-0 kubenswrapper[4813]: I1203 19:53:48.850224 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"openshift-service-ca.crt" Dec 03 19:53:48.850553 master-0 kubenswrapper[4813]: I1203 19:53:48.850412 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 19:53:48.850553 master-0 kubenswrapper[4813]: I1203 19:53:48.850523 4813 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 19:53:48.907475 master-0 kubenswrapper[4813]: I1203 19:53:48.907408 4813 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Dec 03 19:53:48.926145 master-0 kubenswrapper[4813]: I1203 19:53:48.926014 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0c45d22f-1492-47d7-83b6-6dd356a8454d-etc-ssl-certs\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:48.926145 master-0 kubenswrapper[4813]: I1203 19:53:48.926100 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c45d22f-1492-47d7-83b6-6dd356a8454d-service-ca\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:48.926145 master-0 kubenswrapper[4813]: I1203 19:53:48.926133 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bh5s\" (UniqueName: \"kubernetes.io/projected/0b6e1832-278b-4e37-b92b-2584e2daa34c-kube-api-access-9bh5s\") pod \"assisted-installer-controller-ljsns\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:48.926145 master-0 kubenswrapper[4813]: I1203 19:53:48.926157 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7nkb\" (UniqueName: \"kubernetes.io/projected/6eb4700c-6af0-468b-afc8-1e09b902d6bf-kube-api-access-w7nkb\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " 
pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" Dec 03 19:53:48.926516 master-0 kubenswrapper[4813]: I1203 19:53:48.926179 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:48.926516 master-0 kubenswrapper[4813]: I1203 19:53:48.926202 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/6eb4700c-6af0-468b-afc8-1e09b902d6bf-host-etc-kube\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" Dec 03 19:53:48.926516 master-0 kubenswrapper[4813]: I1203 19:53:48.926226 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6eb4700c-6af0-468b-afc8-1e09b902d6bf-metrics-tls\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" Dec 03 19:53:48.926516 master-0 kubenswrapper[4813]: I1203 19:53:48.926252 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-host-ca-bundle\") pod \"assisted-installer-controller-ljsns\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:48.926516 master-0 kubenswrapper[4813]: I1203 19:53:48.926275 4813 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-host-var-run-resolv-conf\") pod \"assisted-installer-controller-ljsns\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:48.926516 master-0 kubenswrapper[4813]: I1203 19:53:48.926306 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-host-resolv-conf\") pod \"assisted-installer-controller-ljsns\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:48.926516 master-0 kubenswrapper[4813]: I1203 19:53:48.926401 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-sno-bootstrap-files\") pod \"assisted-installer-controller-ljsns\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:48.926516 master-0 kubenswrapper[4813]: I1203 19:53:48.926545 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0c45d22f-1492-47d7-83b6-6dd356a8454d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:48.926516 master-0 kubenswrapper[4813]: I1203 19:53:48.926604 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0c45d22f-1492-47d7-83b6-6dd356a8454d-kube-api-access\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:49.027887 master-0 kubenswrapper[4813]: I1203 19:53:49.027728 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0c45d22f-1492-47d7-83b6-6dd356a8454d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:49.027887 master-0 kubenswrapper[4813]: I1203 19:53:49.027837 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c45d22f-1492-47d7-83b6-6dd356a8454d-kube-api-access\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:49.027887 master-0 kubenswrapper[4813]: I1203 19:53:49.027869 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bh5s\" (UniqueName: \"kubernetes.io/projected/0b6e1832-278b-4e37-b92b-2584e2daa34c-kube-api-access-9bh5s\") pod \"assisted-installer-controller-ljsns\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:49.027887 master-0 kubenswrapper[4813]: I1203 19:53:49.027898 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7nkb\" (UniqueName: \"kubernetes.io/projected/6eb4700c-6af0-468b-afc8-1e09b902d6bf-kube-api-access-w7nkb\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " 
pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" Dec 03 19:53:49.028347 master-0 kubenswrapper[4813]: I1203 19:53:49.027923 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0c45d22f-1492-47d7-83b6-6dd356a8454d-etc-ssl-certs\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:49.028347 master-0 kubenswrapper[4813]: I1203 19:53:49.027943 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c45d22f-1492-47d7-83b6-6dd356a8454d-service-ca\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:49.028347 master-0 kubenswrapper[4813]: I1203 19:53:49.027937 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0c45d22f-1492-47d7-83b6-6dd356a8454d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:49.028347 master-0 kubenswrapper[4813]: I1203 19:53:49.027965 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:49.028347 master-0 kubenswrapper[4813]: I1203 19:53:49.028085 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-host-resolv-conf\") pod \"assisted-installer-controller-ljsns\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:49.028347 master-0 kubenswrapper[4813]: I1203 19:53:49.028131 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-sno-bootstrap-files\") pod \"assisted-installer-controller-ljsns\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:49.028347 master-0 kubenswrapper[4813]: I1203 19:53:49.028166 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/6eb4700c-6af0-468b-afc8-1e09b902d6bf-host-etc-kube\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" Dec 03 19:53:49.028347 master-0 kubenswrapper[4813]: I1203 19:53:49.028203 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6eb4700c-6af0-468b-afc8-1e09b902d6bf-metrics-tls\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" Dec 03 19:53:49.028347 master-0 kubenswrapper[4813]: I1203 19:53:49.028209 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-host-resolv-conf\") pod \"assisted-installer-controller-ljsns\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:49.028347 
master-0 kubenswrapper[4813]: I1203 19:53:49.028242 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-host-ca-bundle\") pod \"assisted-installer-controller-ljsns\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:49.028347 master-0 kubenswrapper[4813]: I1203 19:53:49.028270 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0c45d22f-1492-47d7-83b6-6dd356a8454d-etc-ssl-certs\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:49.028347 master-0 kubenswrapper[4813]: I1203 19:53:49.028284 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-host-var-run-resolv-conf\") pod \"assisted-installer-controller-ljsns\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:49.029195 master-0 kubenswrapper[4813]: I1203 19:53:49.028392 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-host-var-run-resolv-conf\") pod \"assisted-installer-controller-ljsns\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:49.029195 master-0 kubenswrapper[4813]: I1203 19:53:49.028453 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-sno-bootstrap-files\") pod 
\"assisted-installer-controller-ljsns\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:49.029195 master-0 kubenswrapper[4813]: I1203 19:53:49.028491 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/6eb4700c-6af0-468b-afc8-1e09b902d6bf-host-etc-kube\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" Dec 03 19:53:49.029195 master-0 kubenswrapper[4813]: I1203 19:53:49.028494 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-host-ca-bundle\") pod \"assisted-installer-controller-ljsns\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:49.029195 master-0 kubenswrapper[4813]: E1203 19:53:49.028577 4813 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 19:53:49.029195 master-0 kubenswrapper[4813]: E1203 19:53:49.028666 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert podName:0c45d22f-1492-47d7-83b6-6dd356a8454d nodeName:}" failed. No retries permitted until 2025-12-03 19:53:49.528636172 +0000 UTC m=+73.957434621 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert") pod "cluster-version-operator-869c786959-zbl42" (UID: "0c45d22f-1492-47d7-83b6-6dd356a8454d") : secret "cluster-version-operator-serving-cert" not found Dec 03 19:53:49.030712 master-0 kubenswrapper[4813]: I1203 19:53:49.030591 4813 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 03 19:53:49.031668 master-0 kubenswrapper[4813]: I1203 19:53:49.031558 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c45d22f-1492-47d7-83b6-6dd356a8454d-service-ca\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:49.040326 master-0 kubenswrapper[4813]: I1203 19:53:49.040244 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6eb4700c-6af0-468b-afc8-1e09b902d6bf-metrics-tls\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" Dec 03 19:53:49.057546 master-0 kubenswrapper[4813]: I1203 19:53:49.057475 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bh5s\" (UniqueName: \"kubernetes.io/projected/0b6e1832-278b-4e37-b92b-2584e2daa34c-kube-api-access-9bh5s\") pod \"assisted-installer-controller-ljsns\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:49.058143 master-0 kubenswrapper[4813]: I1203 19:53:49.058069 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-w7nkb\" (UniqueName: \"kubernetes.io/projected/6eb4700c-6af0-468b-afc8-1e09b902d6bf-kube-api-access-w7nkb\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" Dec 03 19:53:49.064671 master-0 kubenswrapper[4813]: I1203 19:53:49.064607 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c45d22f-1492-47d7-83b6-6dd356a8454d-kube-api-access\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:49.178560 master-0 kubenswrapper[4813]: I1203 19:53:49.178493 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:49.184847 master-0 kubenswrapper[4813]: I1203 19:53:49.184802 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" Dec 03 19:53:49.204185 master-0 kubenswrapper[4813]: W1203 19:53:49.203687 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b6e1832_278b_4e37_b92b_2584e2daa34c.slice/crio-e2dc381563df2a0e13918cfa2451a9b174e4604bd05cb59f08912f9e42b984c0 WatchSource:0}: Error finding container e2dc381563df2a0e13918cfa2451a9b174e4604bd05cb59f08912f9e42b984c0: Status 404 returned error can't find the container with id e2dc381563df2a0e13918cfa2451a9b174e4604bd05cb59f08912f9e42b984c0 Dec 03 19:53:49.207289 master-0 kubenswrapper[4813]: W1203 19:53:49.207193 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eb4700c_6af0_468b_afc8_1e09b902d6bf.slice/crio-fd040c8de744a713ee80a954f75065a2b691638426b8496773ad0910f9875316 WatchSource:0}: Error finding container fd040c8de744a713ee80a954f75065a2b691638426b8496773ad0910f9875316: Status 404 returned error can't find the container with id fd040c8de744a713ee80a954f75065a2b691638426b8496773ad0910f9875316 Dec 03 19:53:49.244394 master-0 kubenswrapper[4813]: I1203 19:53:49.244275 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" event={"ID":"6eb4700c-6af0-468b-afc8-1e09b902d6bf","Type":"ContainerStarted","Data":"fd040c8de744a713ee80a954f75065a2b691638426b8496773ad0910f9875316"} Dec 03 19:53:49.245477 master-0 kubenswrapper[4813]: I1203 19:53:49.245410 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-ljsns" event={"ID":"0b6e1832-278b-4e37-b92b-2584e2daa34c","Type":"ContainerStarted","Data":"e2dc381563df2a0e13918cfa2451a9b174e4604bd05cb59f08912f9e42b984c0"} Dec 03 19:53:49.532698 master-0 kubenswrapper[4813]: I1203 19:53:49.532479 4813 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:49.533542 master-0 kubenswrapper[4813]: E1203 19:53:49.532681 4813 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 19:53:49.533542 master-0 kubenswrapper[4813]: E1203 19:53:49.532854 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert podName:0c45d22f-1492-47d7-83b6-6dd356a8454d nodeName:}" failed. No retries permitted until 2025-12-03 19:53:50.532824689 +0000 UTC m=+74.961623148 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert") pod "cluster-version-operator-869c786959-zbl42" (UID: "0c45d22f-1492-47d7-83b6-6dd356a8454d") : secret "cluster-version-operator-serving-cert" not found Dec 03 19:53:49.690198 master-0 kubenswrapper[4813]: I1203 19:53:49.690109 4813 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 19:53:50.542019 master-0 kubenswrapper[4813]: I1203 19:53:50.541948 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:50.542485 master-0 kubenswrapper[4813]: E1203 19:53:50.542115 4813 secret.go:189] Couldn't get secret 
openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 19:53:50.542485 master-0 kubenswrapper[4813]: E1203 19:53:50.542206 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert podName:0c45d22f-1492-47d7-83b6-6dd356a8454d nodeName:}" failed. No retries permitted until 2025-12-03 19:53:52.542181037 +0000 UTC m=+76.970979496 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert") pod "cluster-version-operator-869c786959-zbl42" (UID: "0c45d22f-1492-47d7-83b6-6dd356a8454d") : secret "cluster-version-operator-serving-cert" not found Dec 03 19:53:52.037840 master-0 kubenswrapper[4813]: I1203 19:53:52.037796 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"] Dec 03 19:53:52.038684 master-0 kubenswrapper[4813]: I1203 19:53:52.037999 4813 scope.go:117] "RemoveContainer" containerID="1e627b854436f132d47750eca5e55963c07ce2a82bb65e7317d2c359a44e0385" Dec 03 19:53:52.038684 master-0 kubenswrapper[4813]: E1203 19:53:52.038269 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="b495b0c38f2c54e7cc46282c5f92aab5" Dec 03 19:53:52.252042 master-0 kubenswrapper[4813]: I1203 19:53:52.251977 4813 scope.go:117] "RemoveContainer" containerID="1e627b854436f132d47750eca5e55963c07ce2a82bb65e7317d2c359a44e0385" Dec 03 19:53:52.252227 master-0 kubenswrapper[4813]: E1203 19:53:52.252166 4813 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="b495b0c38f2c54e7cc46282c5f92aab5" Dec 03 19:53:52.555337 master-0 kubenswrapper[4813]: I1203 19:53:52.555242 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:52.555541 master-0 kubenswrapper[4813]: E1203 19:53:52.555487 4813 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 19:53:52.555657 master-0 kubenswrapper[4813]: E1203 19:53:52.555623 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert podName:0c45d22f-1492-47d7-83b6-6dd356a8454d nodeName:}" failed. No retries permitted until 2025-12-03 19:53:56.555595315 +0000 UTC m=+80.984393804 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert") pod "cluster-version-operator-869c786959-zbl42" (UID: "0c45d22f-1492-47d7-83b6-6dd356a8454d") : secret "cluster-version-operator-serving-cert" not found Dec 03 19:53:54.258363 master-0 kubenswrapper[4813]: I1203 19:53:54.258290 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" event={"ID":"6eb4700c-6af0-468b-afc8-1e09b902d6bf","Type":"ContainerStarted","Data":"5b669ed74eaf8bfa020c73b3caed3c1731e9f130494d0a6716eecb9c6dd302d9"} Dec 03 19:53:54.260465 master-0 kubenswrapper[4813]: I1203 19:53:54.260420 4813 generic.go:334] "Generic (PLEG): container finished" podID="0b6e1832-278b-4e37-b92b-2584e2daa34c" containerID="5340fe194bb64dbc3aba205027b00290cb2a1905847a3d137e4cd0dbb4900723" exitCode=0 Dec 03 19:53:54.260552 master-0 kubenswrapper[4813]: I1203 19:53:54.260467 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-ljsns" event={"ID":"0b6e1832-278b-4e37-b92b-2584e2daa34c","Type":"ContainerDied","Data":"5340fe194bb64dbc3aba205027b00290cb2a1905847a3d137e4cd0dbb4900723"} Dec 03 19:53:54.291811 master-0 kubenswrapper[4813]: I1203 19:53:54.291696 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" podStartSLOduration=32.689262484 podStartE2EDuration="37.291673572s" podCreationTimestamp="2025-12-03 19:53:17 +0000 UTC" firstStartedPulling="2025-12-03 19:53:49.209723445 +0000 UTC m=+73.638521924" lastFinishedPulling="2025-12-03 19:53:53.812134563 +0000 UTC m=+78.240933012" observedRunningTime="2025-12-03 19:53:54.278273271 +0000 UTC m=+78.707071730" watchObservedRunningTime="2025-12-03 19:53:54.291673572 +0000 UTC m=+78.720472031" Dec 03 19:53:55.290125 master-0 kubenswrapper[4813]: I1203 19:53:55.290087 4813 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:55.376968 master-0 kubenswrapper[4813]: I1203 19:53:55.376877 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-sno-bootstrap-files\") pod \"0b6e1832-278b-4e37-b92b-2584e2daa34c\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " Dec 03 19:53:55.376968 master-0 kubenswrapper[4813]: I1203 19:53:55.376930 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bh5s\" (UniqueName: \"kubernetes.io/projected/0b6e1832-278b-4e37-b92b-2584e2daa34c-kube-api-access-9bh5s\") pod \"0b6e1832-278b-4e37-b92b-2584e2daa34c\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " Dec 03 19:53:55.376968 master-0 kubenswrapper[4813]: I1203 19:53:55.376948 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-host-ca-bundle\") pod \"0b6e1832-278b-4e37-b92b-2584e2daa34c\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " Dec 03 19:53:55.376968 master-0 kubenswrapper[4813]: I1203 19:53:55.376964 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-host-resolv-conf\") pod \"0b6e1832-278b-4e37-b92b-2584e2daa34c\" (UID: \"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " Dec 03 19:53:55.376968 master-0 kubenswrapper[4813]: I1203 19:53:55.376983 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-host-var-run-resolv-conf\") pod \"0b6e1832-278b-4e37-b92b-2584e2daa34c\" (UID: 
\"0b6e1832-278b-4e37-b92b-2584e2daa34c\") " Dec 03 19:53:55.377467 master-0 kubenswrapper[4813]: I1203 19:53:55.377019 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-sno-bootstrap-files" (OuterVolumeSpecName: "sno-bootstrap-files") pod "0b6e1832-278b-4e37-b92b-2584e2daa34c" (UID: "0b6e1832-278b-4e37-b92b-2584e2daa34c"). InnerVolumeSpecName "sno-bootstrap-files". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:53:55.377467 master-0 kubenswrapper[4813]: I1203 19:53:55.377076 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-host-resolv-conf" (OuterVolumeSpecName: "host-resolv-conf") pod "0b6e1832-278b-4e37-b92b-2584e2daa34c" (UID: "0b6e1832-278b-4e37-b92b-2584e2daa34c"). InnerVolumeSpecName "host-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:53:55.377467 master-0 kubenswrapper[4813]: I1203 19:53:55.377093 4813 reconciler_common.go:293] "Volume detached for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-sno-bootstrap-files\") on node \"master-0\" DevicePath \"\"" Dec 03 19:53:55.377467 master-0 kubenswrapper[4813]: I1203 19:53:55.377128 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-host-var-run-resolv-conf" (OuterVolumeSpecName: "host-var-run-resolv-conf") pod "0b6e1832-278b-4e37-b92b-2584e2daa34c" (UID: "0b6e1832-278b-4e37-b92b-2584e2daa34c"). InnerVolumeSpecName "host-var-run-resolv-conf". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:53:55.377467 master-0 kubenswrapper[4813]: I1203 19:53:55.377215 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-host-ca-bundle" (OuterVolumeSpecName: "host-ca-bundle") pod "0b6e1832-278b-4e37-b92b-2584e2daa34c" (UID: "0b6e1832-278b-4e37-b92b-2584e2daa34c"). InnerVolumeSpecName "host-ca-bundle". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:53:55.384172 master-0 kubenswrapper[4813]: I1203 19:53:55.384080 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b6e1832-278b-4e37-b92b-2584e2daa34c-kube-api-access-9bh5s" (OuterVolumeSpecName: "kube-api-access-9bh5s") pod "0b6e1832-278b-4e37-b92b-2584e2daa34c" (UID: "0b6e1832-278b-4e37-b92b-2584e2daa34c"). InnerVolumeSpecName "kube-api-access-9bh5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:53:55.478325 master-0 kubenswrapper[4813]: I1203 19:53:55.478242 4813 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bh5s\" (UniqueName: \"kubernetes.io/projected/0b6e1832-278b-4e37-b92b-2584e2daa34c-kube-api-access-9bh5s\") on node \"master-0\" DevicePath \"\"" Dec 03 19:53:55.478325 master-0 kubenswrapper[4813]: I1203 19:53:55.478297 4813 reconciler_common.go:293] "Volume detached for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-host-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 19:53:55.478325 master-0 kubenswrapper[4813]: I1203 19:53:55.478315 4813 reconciler_common.go:293] "Volume detached for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-host-resolv-conf\") on node \"master-0\" DevicePath \"\"" Dec 03 19:53:55.478325 master-0 kubenswrapper[4813]: I1203 19:53:55.478333 4813 reconciler_common.go:293] "Volume detached for volume 
\"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0b6e1832-278b-4e37-b92b-2584e2daa34c-host-var-run-resolv-conf\") on node \"master-0\" DevicePath \"\"" Dec 03 19:53:56.268007 master-0 kubenswrapper[4813]: I1203 19:53:56.267916 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-ljsns" event={"ID":"0b6e1832-278b-4e37-b92b-2584e2daa34c","Type":"ContainerDied","Data":"e2dc381563df2a0e13918cfa2451a9b174e4604bd05cb59f08912f9e42b984c0"} Dec 03 19:53:56.268007 master-0 kubenswrapper[4813]: I1203 19:53:56.267959 4813 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2dc381563df2a0e13918cfa2451a9b174e4604bd05cb59f08912f9e42b984c0" Dec 03 19:53:56.268007 master-0 kubenswrapper[4813]: I1203 19:53:56.267993 4813 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 19:53:56.581491 master-0 kubenswrapper[4813]: I1203 19:53:56.581398 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/mtu-prober-ldzw6"] Dec 03 19:53:56.582558 master-0 kubenswrapper[4813]: E1203 19:53:56.581521 4813 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b6e1832-278b-4e37-b92b-2584e2daa34c" containerName="assisted-installer-controller" Dec 03 19:53:56.582558 master-0 kubenswrapper[4813]: I1203 19:53:56.581535 4813 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b6e1832-278b-4e37-b92b-2584e2daa34c" containerName="assisted-installer-controller" Dec 03 19:53:56.582558 master-0 kubenswrapper[4813]: I1203 19:53:56.581807 4813 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b6e1832-278b-4e37-b92b-2584e2daa34c" containerName="assisted-installer-controller" Dec 03 19:53:56.582558 master-0 kubenswrapper[4813]: I1203 19:53:56.582042 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-ldzw6" Dec 03 19:53:56.587089 master-0 kubenswrapper[4813]: I1203 19:53:56.587010 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:53:56.587255 master-0 kubenswrapper[4813]: E1203 19:53:56.587213 4813 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 19:53:56.587370 master-0 kubenswrapper[4813]: E1203 19:53:56.587303 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert podName:0c45d22f-1492-47d7-83b6-6dd356a8454d nodeName:}" failed. No retries permitted until 2025-12-03 19:54:04.587278947 +0000 UTC m=+89.016077436 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert") pod "cluster-version-operator-869c786959-zbl42" (UID: "0c45d22f-1492-47d7-83b6-6dd356a8454d") : secret "cluster-version-operator-serving-cert" not found Dec 03 19:53:56.687822 master-0 kubenswrapper[4813]: I1203 19:53:56.687699 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m746\" (UniqueName: \"kubernetes.io/projected/74ccc53d-803e-4d7d-a9b0-6cd604e7907a-kube-api-access-9m746\") pod \"mtu-prober-ldzw6\" (UID: \"74ccc53d-803e-4d7d-a9b0-6cd604e7907a\") " pod="openshift-network-operator/mtu-prober-ldzw6" Dec 03 19:53:56.788624 master-0 kubenswrapper[4813]: I1203 19:53:56.788491 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m746\" (UniqueName: \"kubernetes.io/projected/74ccc53d-803e-4d7d-a9b0-6cd604e7907a-kube-api-access-9m746\") pod \"mtu-prober-ldzw6\" (UID: \"74ccc53d-803e-4d7d-a9b0-6cd604e7907a\") " pod="openshift-network-operator/mtu-prober-ldzw6" Dec 03 19:53:56.819397 master-0 kubenswrapper[4813]: I1203 19:53:56.819328 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m746\" (UniqueName: \"kubernetes.io/projected/74ccc53d-803e-4d7d-a9b0-6cd604e7907a-kube-api-access-9m746\") pod \"mtu-prober-ldzw6\" (UID: \"74ccc53d-803e-4d7d-a9b0-6cd604e7907a\") " pod="openshift-network-operator/mtu-prober-ldzw6" Dec 03 19:53:56.904381 master-0 kubenswrapper[4813]: I1203 19:53:56.904306 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-ldzw6" Dec 03 19:53:56.914130 master-0 kubenswrapper[4813]: W1203 19:53:56.914071 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74ccc53d_803e_4d7d_a9b0_6cd604e7907a.slice/crio-205cff62e7c175b9c7d41b9cc295efc2fd405b7085e08d33345741b64c79849c WatchSource:0}: Error finding container 205cff62e7c175b9c7d41b9cc295efc2fd405b7085e08d33345741b64c79849c: Status 404 returned error can't find the container with id 205cff62e7c175b9c7d41b9cc295efc2fd405b7085e08d33345741b64c79849c Dec 03 19:53:57.272289 master-0 kubenswrapper[4813]: I1203 19:53:57.272232 4813 generic.go:334] "Generic (PLEG): container finished" podID="74ccc53d-803e-4d7d-a9b0-6cd604e7907a" containerID="8bcfa4660c84f8191cb52e8becfb5db2481eb6ba813d896bb3f747ba456753f9" exitCode=0 Dec 03 19:53:57.272289 master-0 kubenswrapper[4813]: I1203 19:53:57.272270 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-ldzw6" event={"ID":"74ccc53d-803e-4d7d-a9b0-6cd604e7907a","Type":"ContainerDied","Data":"8bcfa4660c84f8191cb52e8becfb5db2481eb6ba813d896bb3f747ba456753f9"} Dec 03 19:53:57.272289 master-0 kubenswrapper[4813]: I1203 19:53:57.272293 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-ldzw6" event={"ID":"74ccc53d-803e-4d7d-a9b0-6cd604e7907a","Type":"ContainerStarted","Data":"205cff62e7c175b9c7d41b9cc295efc2fd405b7085e08d33345741b64c79849c"} Dec 03 19:53:58.303175 master-0 kubenswrapper[4813]: I1203 19:53:58.303114 4813 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-ldzw6" Dec 03 19:53:58.399894 master-0 kubenswrapper[4813]: I1203 19:53:58.399764 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9m746\" (UniqueName: \"kubernetes.io/projected/74ccc53d-803e-4d7d-a9b0-6cd604e7907a-kube-api-access-9m746\") pod \"74ccc53d-803e-4d7d-a9b0-6cd604e7907a\" (UID: \"74ccc53d-803e-4d7d-a9b0-6cd604e7907a\") " Dec 03 19:53:58.404744 master-0 kubenswrapper[4813]: I1203 19:53:58.404645 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74ccc53d-803e-4d7d-a9b0-6cd604e7907a-kube-api-access-9m746" (OuterVolumeSpecName: "kube-api-access-9m746") pod "74ccc53d-803e-4d7d-a9b0-6cd604e7907a" (UID: "74ccc53d-803e-4d7d-a9b0-6cd604e7907a"). InnerVolumeSpecName "kube-api-access-9m746". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:53:58.500808 master-0 kubenswrapper[4813]: I1203 19:53:58.500712 4813 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9m746\" (UniqueName: \"kubernetes.io/projected/74ccc53d-803e-4d7d-a9b0-6cd604e7907a-kube-api-access-9m746\") on node \"master-0\" DevicePath \"\"" Dec 03 19:53:59.280539 master-0 kubenswrapper[4813]: I1203 19:53:59.280414 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-ldzw6" event={"ID":"74ccc53d-803e-4d7d-a9b0-6cd604e7907a","Type":"ContainerDied","Data":"205cff62e7c175b9c7d41b9cc295efc2fd405b7085e08d33345741b64c79849c"} Dec 03 19:53:59.280539 master-0 kubenswrapper[4813]: I1203 19:53:59.280513 4813 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-ldzw6" Dec 03 19:53:59.280926 master-0 kubenswrapper[4813]: I1203 19:53:59.280518 4813 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="205cff62e7c175b9c7d41b9cc295efc2fd405b7085e08d33345741b64c79849c" Dec 03 19:54:03.276101 master-0 kubenswrapper[4813]: I1203 19:54:03.276037 4813 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-network-operator/mtu-prober-ldzw6"] Dec 03 19:54:03.280548 master-0 kubenswrapper[4813]: I1203 19:54:03.280479 4813 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-network-operator/mtu-prober-ldzw6"] Dec 03 19:54:04.023637 master-0 kubenswrapper[4813]: I1203 19:54:04.023525 4813 scope.go:117] "RemoveContainer" containerID="1e627b854436f132d47750eca5e55963c07ce2a82bb65e7317d2c359a44e0385" Dec 03 19:54:04.023953 master-0 kubenswrapper[4813]: E1203 19:54:04.023905 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="b495b0c38f2c54e7cc46282c5f92aab5" Dec 03 19:54:04.030629 master-0 kubenswrapper[4813]: I1203 19:54:04.030566 4813 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74ccc53d-803e-4d7d-a9b0-6cd604e7907a" path="/var/lib/kubelet/pods/74ccc53d-803e-4d7d-a9b0-6cd604e7907a/volumes" Dec 03 19:54:04.649001 master-0 kubenswrapper[4813]: I1203 19:54:04.648914 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " 
pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:54:04.649758 master-0 kubenswrapper[4813]: E1203 19:54:04.649130 4813 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 19:54:04.649758 master-0 kubenswrapper[4813]: E1203 19:54:04.649251 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert podName:0c45d22f-1492-47d7-83b6-6dd356a8454d nodeName:}" failed. No retries permitted until 2025-12-03 19:54:20.649220989 +0000 UTC m=+105.078019468 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert") pod "cluster-version-operator-869c786959-zbl42" (UID: "0c45d22f-1492-47d7-83b6-6dd356a8454d") : secret "cluster-version-operator-serving-cert" not found Dec 03 19:54:06.037720 master-0 kubenswrapper[4813]: W1203 19:54:06.037640 4813 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Dec 03 19:54:06.038528 master-0 kubenswrapper[4813]: I1203 19:54:06.038258 4813 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Dec 03 19:54:08.034353 master-0 kubenswrapper[4813]: I1203 19:54:08.034261 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-p9sdj"] Dec 03 19:54:08.034961 master-0 kubenswrapper[4813]: E1203 19:54:08.034399 4813 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74ccc53d-803e-4d7d-a9b0-6cd604e7907a" containerName="prober" Dec 03 19:54:08.034961 master-0 kubenswrapper[4813]: I1203 19:54:08.034427 4813 state_mem.go:107] "Deleted CPUSet assignment" podUID="74ccc53d-803e-4d7d-a9b0-6cd604e7907a" containerName="prober" Dec 03 19:54:08.034961 master-0 kubenswrapper[4813]: I1203 19:54:08.034479 4813 memory_manager.go:354] "RemoveStaleState removing state" podUID="74ccc53d-803e-4d7d-a9b0-6cd604e7907a" containerName="prober" Dec 03 19:54:08.034961 master-0 kubenswrapper[4813]: I1203 19:54:08.034844 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.037044 master-0 kubenswrapper[4813]: I1203 19:54:08.036983 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 19:54:08.038060 master-0 kubenswrapper[4813]: I1203 19:54:08.038001 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 19:54:08.038060 master-0 kubenswrapper[4813]: I1203 19:54:08.038015 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 19:54:08.038223 master-0 kubenswrapper[4813]: I1203 19:54:08.038031 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 19:54:08.074126 master-0 kubenswrapper[4813]: I1203 19:54:08.074026 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0-master-0" podStartSLOduration=2.07400574 
podStartE2EDuration="2.07400574s" podCreationTimestamp="2025-12-03 19:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:54:08.052416725 +0000 UTC m=+92.481215184" watchObservedRunningTime="2025-12-03 19:54:08.07400574 +0000 UTC m=+92.502804209" Dec 03 19:54:08.076038 master-0 kubenswrapper[4813]: I1203 19:54:08.075994 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-os-release\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.076101 master-0 kubenswrapper[4813]: I1203 19:54:08.076055 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-cni-bin\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.076150 master-0 kubenswrapper[4813]: I1203 19:54:08.076097 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-etc-kubernetes\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.076150 master-0 kubenswrapper[4813]: I1203 19:54:08.076119 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-system-cni-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.076150 master-0 kubenswrapper[4813]: I1203 
19:54:08.076140 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-cni-binary-copy\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.076257 master-0 kubenswrapper[4813]: I1203 19:54:08.076160 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-hostroot\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.076257 master-0 kubenswrapper[4813]: I1203 19:54:08.076211 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-cni-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.076257 master-0 kubenswrapper[4813]: I1203 19:54:08.076235 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-cni-multus\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.076368 master-0 kubenswrapper[4813]: I1203 19:54:08.076257 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-daemon-config\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.076368 master-0 kubenswrapper[4813]: I1203 
19:54:08.076277 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-multus-certs\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.076368 master-0 kubenswrapper[4813]: I1203 19:54:08.076302 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-kubelet\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.076368 master-0 kubenswrapper[4813]: I1203 19:54:08.076322 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-conf-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.076368 master-0 kubenswrapper[4813]: I1203 19:54:08.076363 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-socket-dir-parent\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.076540 master-0 kubenswrapper[4813]: I1203 19:54:08.076382 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bztz2\" (UniqueName: \"kubernetes.io/projected/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-kube-api-access-bztz2\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 
19:54:08.076540 master-0 kubenswrapper[4813]: I1203 19:54:08.076402 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-netns\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.076540 master-0 kubenswrapper[4813]: I1203 19:54:08.076424 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-k8s-cni-cncf-io\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.076540 master-0 kubenswrapper[4813]: I1203 19:54:08.076447 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-cnibin\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.177360 master-0 kubenswrapper[4813]: I1203 19:54:08.177258 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-cni-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.177360 master-0 kubenswrapper[4813]: I1203 19:54:08.177312 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-cni-multus\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.177360 master-0 
kubenswrapper[4813]: I1203 19:54:08.177335 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-daemon-config\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.177360 master-0 kubenswrapper[4813]: I1203 19:54:08.177356 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-multus-certs\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.177928 master-0 kubenswrapper[4813]: I1203 19:54:08.177399 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-kubelet\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.177928 master-0 kubenswrapper[4813]: I1203 19:54:08.177422 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-conf-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.177928 master-0 kubenswrapper[4813]: I1203 19:54:08.177446 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-socket-dir-parent\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.177928 master-0 kubenswrapper[4813]: I1203 19:54:08.177466 4813 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bztz2\" (UniqueName: \"kubernetes.io/projected/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-kube-api-access-bztz2\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.177928 master-0 kubenswrapper[4813]: I1203 19:54:08.177485 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-netns\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.177928 master-0 kubenswrapper[4813]: I1203 19:54:08.177507 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-cnibin\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.177928 master-0 kubenswrapper[4813]: I1203 19:54:08.177533 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-k8s-cni-cncf-io\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.177928 master-0 kubenswrapper[4813]: I1203 19:54:08.177555 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-os-release\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.177928 master-0 kubenswrapper[4813]: I1203 19:54:08.177574 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-cni-bin\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.177928 master-0 kubenswrapper[4813]: I1203 19:54:08.177596 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-hostroot\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.177928 master-0 kubenswrapper[4813]: I1203 19:54:08.177617 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-etc-kubernetes\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.177928 master-0 kubenswrapper[4813]: I1203 19:54:08.177601 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-multus-certs\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.177928 master-0 kubenswrapper[4813]: I1203 19:54:08.177678 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-cni-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:54:08.177928 master-0 kubenswrapper[4813]: I1203 19:54:08.177700 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-kubelet\") pod \"multus-p9sdj\" (UID: 
\"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:54:08.177928 master-0 kubenswrapper[4813]: I1203 19:54:08.177677 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-socket-dir-parent\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:54:08.177928 master-0 kubenswrapper[4813]: I1203 19:54:08.177637 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-system-cni-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:54:08.177928 master-0 kubenswrapper[4813]: I1203 19:54:08.177766 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-cni-multus\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:54:08.177928 master-0 kubenswrapper[4813]: I1203 19:54:08.177845 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-system-cni-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:54:08.179252 master-0 kubenswrapper[4813]: I1203 19:54:08.177821 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-k8s-cni-cncf-io\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:54:08.179252 master-0 kubenswrapper[4813]: I1203 19:54:08.177868 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-hostroot\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:54:08.179252 master-0 kubenswrapper[4813]: I1203 19:54:08.177769 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-conf-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:54:08.179252 master-0 kubenswrapper[4813]: I1203 19:54:08.177889 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-netns\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:54:08.179252 master-0 kubenswrapper[4813]: I1203 19:54:08.177928 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-etc-kubernetes\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:54:08.179252 master-0 kubenswrapper[4813]: I1203 19:54:08.177922 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-cnibin\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:54:08.179252 master-0 kubenswrapper[4813]: I1203 19:54:08.177876 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-cni-bin\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:54:08.179252 master-0 kubenswrapper[4813]: I1203 19:54:08.177973 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-cni-binary-copy\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:54:08.179252 master-0 kubenswrapper[4813]: I1203 19:54:08.177974 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-os-release\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:54:08.179252 master-0 kubenswrapper[4813]: I1203 19:54:08.178900 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-daemon-config\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:54:08.180043 master-0 kubenswrapper[4813]: I1203 19:54:08.179312 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-cni-binary-copy\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:54:08.199333 master-0 kubenswrapper[4813]: I1203 19:54:08.199262 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bztz2\" (UniqueName: \"kubernetes.io/projected/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-kube-api-access-bztz2\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:54:08.202145 master-0 kubenswrapper[4813]: I1203 19:54:08.202099 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-pwlw2"]
Dec 03 19:54:08.202931 master-0 kubenswrapper[4813]: I1203 19:54:08.202894 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.207317 master-0 kubenswrapper[4813]: I1203 19:54:08.207264 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config"
Dec 03 19:54:08.207527 master-0 kubenswrapper[4813]: I1203 19:54:08.207264 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Dec 03 19:54:08.278894 master-0 kubenswrapper[4813]: I1203 19:54:08.278818 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-os-release\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.278894 master-0 kubenswrapper[4813]: I1203 19:54:08.278865 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.278894 master-0 kubenswrapper[4813]: I1203 19:54:08.278887 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvllg\" (UniqueName: \"kubernetes.io/projected/87f1759a-7df4-442e-a22d-6de8d54be333-kube-api-access-wvllg\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.278894 master-0 kubenswrapper[4813]: I1203 19:54:08.278907 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-system-cni-dir\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.279210 master-0 kubenswrapper[4813]: I1203 19:54:08.278923 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.279210 master-0 kubenswrapper[4813]: I1203 19:54:08.279073 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-cni-binary-copy\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.279210 master-0 kubenswrapper[4813]: I1203 19:54:08.279126 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-whereabouts-configmap\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.279210 master-0 kubenswrapper[4813]: I1203 19:54:08.279171 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-cnibin\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.348028 master-0 kubenswrapper[4813]: I1203 19:54:08.347866 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-p9sdj"
Dec 03 19:54:08.360459 master-0 kubenswrapper[4813]: W1203 19:54:08.360378 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9d3cd3e_98b8_4a41_a6bc_8837332fb6a6.slice/crio-d0d3daec6476579642facd81cb6257eb10f7c617299056e3757e4a0c79c948a4 WatchSource:0}: Error finding container d0d3daec6476579642facd81cb6257eb10f7c617299056e3757e4a0c79c948a4: Status 404 returned error can't find the container with id d0d3daec6476579642facd81cb6257eb10f7c617299056e3757e4a0c79c948a4
Dec 03 19:54:08.379898 master-0 kubenswrapper[4813]: I1203 19:54:08.379485 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-os-release\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.380036 master-0 kubenswrapper[4813]: I1203 19:54:08.379911 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.380036 master-0 kubenswrapper[4813]: I1203 19:54:08.379637 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-os-release\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.380036 master-0 kubenswrapper[4813]: I1203 19:54:08.379957 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvllg\" (UniqueName: \"kubernetes.io/projected/87f1759a-7df4-442e-a22d-6de8d54be333-kube-api-access-wvllg\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.380036 master-0 kubenswrapper[4813]: I1203 19:54:08.379993 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-system-cni-dir\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.380036 master-0 kubenswrapper[4813]: I1203 19:54:08.380031 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.380358 master-0 kubenswrapper[4813]: I1203 19:54:08.380096 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-cni-binary-copy\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.380358 master-0 kubenswrapper[4813]: I1203 19:54:08.380121 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-system-cni-dir\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.380358 master-0 kubenswrapper[4813]: I1203 19:54:08.380130 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-whereabouts-configmap\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.380358 master-0 kubenswrapper[4813]: I1203 19:54:08.380206 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-cnibin\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.380358 master-0 kubenswrapper[4813]: I1203 19:54:08.380271 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-cnibin\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.380621 master-0 kubenswrapper[4813]: I1203 19:54:08.380385 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.381246 master-0 kubenswrapper[4813]: I1203 19:54:08.381181 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.381630 master-0 kubenswrapper[4813]: I1203 19:54:08.381578 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-whereabouts-configmap\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.381927 master-0 kubenswrapper[4813]: I1203 19:54:08.381877 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-cni-binary-copy\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.408439 master-0 kubenswrapper[4813]: I1203 19:54:08.408377 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvllg\" (UniqueName: \"kubernetes.io/projected/87f1759a-7df4-442e-a22d-6de8d54be333-kube-api-access-wvllg\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.533585 master-0 kubenswrapper[4813]: I1203 19:54:08.533497 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:54:08.550560 master-0 kubenswrapper[4813]: W1203 19:54:08.550480 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87f1759a_7df4_442e_a22d_6de8d54be333.slice/crio-048697d9a6342582c8b3059ecb9a0cfe7c0a764a192a00f0ded82f4081cc7252 WatchSource:0}: Error finding container 048697d9a6342582c8b3059ecb9a0cfe7c0a764a192a00f0ded82f4081cc7252: Status 404 returned error can't find the container with id 048697d9a6342582c8b3059ecb9a0cfe7c0a764a192a00f0ded82f4081cc7252
Dec 03 19:54:08.986209 master-0 kubenswrapper[4813]: I1203 19:54:08.986115 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-hs6gf"]
Dec 03 19:54:08.986637 master-0 kubenswrapper[4813]: I1203 19:54:08.986586 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:54:08.986748 master-0 kubenswrapper[4813]: E1203 19:54:08.986694 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325"
Dec 03 19:54:09.086760 master-0 kubenswrapper[4813]: I1203 19:54:09.086686 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:54:09.087299 master-0 kubenswrapper[4813]: I1203 19:54:09.086773 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2c85\" (UniqueName: \"kubernetes.io/projected/46b5d4d0-b841-4e87-84b4-85911ff04325-kube-api-access-s2c85\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:54:09.187915 master-0 kubenswrapper[4813]: I1203 19:54:09.187831 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2c85\" (UniqueName: \"kubernetes.io/projected/46b5d4d0-b841-4e87-84b4-85911ff04325-kube-api-access-s2c85\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:54:09.188176 master-0 kubenswrapper[4813]: I1203 19:54:09.187928 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:54:09.188176 master-0 kubenswrapper[4813]: E1203 19:54:09.188052 4813 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 19:54:09.188176 master-0 kubenswrapper[4813]: E1203 19:54:09.188124 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs podName:46b5d4d0-b841-4e87-84b4-85911ff04325 nodeName:}" failed. No retries permitted until 2025-12-03 19:54:09.688104878 +0000 UTC m=+94.116903337 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs") pod "network-metrics-daemon-hs6gf" (UID: "46b5d4d0-b841-4e87-84b4-85911ff04325") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 19:54:09.216434 master-0 kubenswrapper[4813]: I1203 19:54:09.216366 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2c85\" (UniqueName: \"kubernetes.io/projected/46b5d4d0-b841-4e87-84b4-85911ff04325-kube-api-access-s2c85\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:54:09.303969 master-0 kubenswrapper[4813]: I1203 19:54:09.303844 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pwlw2" event={"ID":"87f1759a-7df4-442e-a22d-6de8d54be333","Type":"ContainerStarted","Data":"048697d9a6342582c8b3059ecb9a0cfe7c0a764a192a00f0ded82f4081cc7252"}
Dec 03 19:54:09.305214 master-0 kubenswrapper[4813]: I1203 19:54:09.305158 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p9sdj" event={"ID":"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6","Type":"ContainerStarted","Data":"d0d3daec6476579642facd81cb6257eb10f7c617299056e3757e4a0c79c948a4"}
Dec 03 19:54:09.692584 master-0 kubenswrapper[4813]: I1203 19:54:09.692475 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:54:09.692893 master-0 kubenswrapper[4813]: E1203 19:54:09.692623 4813 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 19:54:09.692893 master-0 kubenswrapper[4813]: E1203 19:54:09.692682 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs podName:46b5d4d0-b841-4e87-84b4-85911ff04325 nodeName:}" failed. No retries permitted until 2025-12-03 19:54:10.692667204 +0000 UTC m=+95.121465643 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs") pod "network-metrics-daemon-hs6gf" (UID: "46b5d4d0-b841-4e87-84b4-85911ff04325") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 19:54:10.701015 master-0 kubenswrapper[4813]: I1203 19:54:10.700972 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:54:10.702369 master-0 kubenswrapper[4813]: E1203 19:54:10.701128 4813 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 19:54:10.702369 master-0 kubenswrapper[4813]: E1203 19:54:10.701189 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs podName:46b5d4d0-b841-4e87-84b4-85911ff04325 nodeName:}" failed. No retries permitted until 2025-12-03 19:54:12.701174602 +0000 UTC m=+97.129973051 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs") pod "network-metrics-daemon-hs6gf" (UID: "46b5d4d0-b841-4e87-84b4-85911ff04325") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 19:54:11.023273 master-0 kubenswrapper[4813]: I1203 19:54:11.023207 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:54:11.023439 master-0 kubenswrapper[4813]: E1203 19:54:11.023398 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325"
Dec 03 19:54:11.035233 master-0 kubenswrapper[4813]: I1203 19:54:11.035161 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Dec 03 19:54:11.312015 master-0 kubenswrapper[4813]: I1203 19:54:11.311870 4813 generic.go:334] "Generic (PLEG): container finished" podID="87f1759a-7df4-442e-a22d-6de8d54be333" containerID="9f0406d26b61880d05d604bbabebaeef16d5bda27cf4f4f9e097201539e44456" exitCode=0
Dec 03 19:54:11.312183 master-0 kubenswrapper[4813]: I1203 19:54:11.312017 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pwlw2" event={"ID":"87f1759a-7df4-442e-a22d-6de8d54be333","Type":"ContainerDied","Data":"9f0406d26b61880d05d604bbabebaeef16d5bda27cf4f4f9e097201539e44456"}
Dec 03 19:54:11.332083 master-0 kubenswrapper[4813]: I1203 19:54:11.332012 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podStartSLOduration=0.331990681 podStartE2EDuration="331.990681ms" podCreationTimestamp="2025-12-03 19:54:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:54:11.331281635 +0000 UTC m=+95.760080124" watchObservedRunningTime="2025-12-03 19:54:11.331990681 +0000 UTC m=+95.760789150"
Dec 03 19:54:12.037940 master-0 kubenswrapper[4813]: I1203 19:54:12.037888 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Dec 03 19:54:12.717829 master-0 kubenswrapper[4813]: I1203 19:54:12.717761 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:54:12.718051 master-0 kubenswrapper[4813]: E1203 19:54:12.717970 4813 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 19:54:12.718211 master-0 kubenswrapper[4813]: E1203 19:54:12.718181 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs podName:46b5d4d0-b841-4e87-84b4-85911ff04325 nodeName:}" failed. No retries permitted until 2025-12-03 19:54:16.718151435 +0000 UTC m=+101.146949914 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs") pod "network-metrics-daemon-hs6gf" (UID: "46b5d4d0-b841-4e87-84b4-85911ff04325") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 19:54:13.022879 master-0 kubenswrapper[4813]: I1203 19:54:13.022750 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:54:13.023096 master-0 kubenswrapper[4813]: E1203 19:54:13.022888 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325"
Dec 03 19:54:15.022763 master-0 kubenswrapper[4813]: I1203 19:54:15.022701 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:54:15.023444 master-0 kubenswrapper[4813]: E1203 19:54:15.022863 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325"
Dec 03 19:54:16.165469 master-0 kubenswrapper[4813]: I1203 19:54:16.165205 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-controller-manager-master-0" podStartSLOduration=4.165176879 podStartE2EDuration="4.165176879s" podCreationTimestamp="2025-12-03 19:54:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:54:16.164475443 +0000 UTC m=+100.593273942" watchObservedRunningTime="2025-12-03 19:54:16.165176879 +0000 UTC m=+100.593975368"
Dec 03 19:54:16.749008 master-0 kubenswrapper[4813]: I1203 19:54:16.748948 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:54:16.749210 master-0 kubenswrapper[4813]: E1203 19:54:16.749066 4813 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 19:54:16.749210 master-0 kubenswrapper[4813]: E1203 19:54:16.749124 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs podName:46b5d4d0-b841-4e87-84b4-85911ff04325 nodeName:}" failed. No retries permitted until 2025-12-03 19:54:24.74910712 +0000 UTC m=+109.177905569 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs") pod "network-metrics-daemon-hs6gf" (UID: "46b5d4d0-b841-4e87-84b4-85911ff04325") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 19:54:17.023513 master-0 kubenswrapper[4813]: I1203 19:54:17.023406 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:54:17.024175 master-0 kubenswrapper[4813]: E1203 19:54:17.023908 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325"
Dec 03 19:54:17.024175 master-0 kubenswrapper[4813]: I1203 19:54:17.023986 4813 scope.go:117] "RemoveContainer" containerID="1e627b854436f132d47750eca5e55963c07ce2a82bb65e7317d2c359a44e0385"
Dec 03 19:54:17.035016 master-0 kubenswrapper[4813]: I1203 19:54:17.034971 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Dec 03 19:54:19.022962 master-0 kubenswrapper[4813]: I1203 19:54:19.022903 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:54:19.023451 master-0 kubenswrapper[4813]: E1203 19:54:19.023064 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325"
Dec 03 19:54:19.330644 master-0 kubenswrapper[4813]: I1203 19:54:19.330596 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/3.log"
Dec 03 19:54:19.331632 master-0 kubenswrapper[4813]: I1203 19:54:19.331104 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerStarted","Data":"613dced068ceb2df4bbce683ccba9c87ef2fc3f6a3e401852118424ac1bf3a4c"}
Dec 03 19:54:19.342340 master-0 kubenswrapper[4813]: I1203 19:54:19.342230 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-scheduler-master-0" podStartSLOduration=2.342215008 podStartE2EDuration="2.342215008s" podCreationTimestamp="2025-12-03 19:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:54:19.341081451 +0000 UTC m=+103.769879900" watchObservedRunningTime="2025-12-03 19:54:19.342215008 +0000 UTC m=+103.771013447"
Dec 03 19:54:19.350879 master-0 kubenswrapper[4813]: I1203 19:54:19.350814 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podStartSLOduration=27.350789222 podStartE2EDuration="27.350789222s" podCreationTimestamp="2025-12-03 19:53:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:54:19.350770222 +0000 UTC m=+103.779568671" watchObservedRunningTime="2025-12-03 19:54:19.350789222 +0000 UTC m=+103.779587681"
Dec 03 19:54:20.340705 master-0 kubenswrapper[4813]: I1203 19:54:20.340644 4813 generic.go:334] "Generic (PLEG): container finished" podID="87f1759a-7df4-442e-a22d-6de8d54be333" containerID="afd903622e2f7d6d9391f2df58084fdf90b41e4e17808cb5e2d5c792f644b6df" exitCode=0
Dec 03 19:54:20.341666 master-0 kubenswrapper[4813]: I1203 19:54:20.340745 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pwlw2" event={"ID":"87f1759a-7df4-442e-a22d-6de8d54be333","Type":"ContainerDied","Data":"afd903622e2f7d6d9391f2df58084fdf90b41e4e17808cb5e2d5c792f644b6df"}
Dec 03 19:54:20.344115 master-0 kubenswrapper[4813]: I1203 19:54:20.343669 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p9sdj" event={"ID":"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6","Type":"ContainerStarted","Data":"8a74abebd0e92eb267bf92fa216f251466a061d49782c0f5612aabcb75ab61c6"}
Dec 03 19:54:20.389664 master-0 kubenswrapper[4813]: I1203 19:54:20.389515 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-p9sdj" podStartSLOduration=1.543660396 podStartE2EDuration="12.389430449s" podCreationTimestamp="2025-12-03 19:54:08 +0000 UTC" firstStartedPulling="2025-12-03 19:54:08.363721006 +0000 UTC m=+92.792519495" lastFinishedPulling="2025-12-03 19:54:19.209491049 +0000 UTC m=+103.638289548" observedRunningTime="2025-12-03 19:54:20.389226044 +0000 UTC m=+104.818024553" watchObservedRunningTime="2025-12-03 19:54:20.389430449 +0000 UTC m=+104.818228938"
Dec 03 19:54:20.407131 master-0 kubenswrapper[4813]: I1203 19:54:20.407037 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg"]
Dec 03 19:54:20.407618 master-0 kubenswrapper[4813]: I1203 19:54:20.407563 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg"
Dec 03 19:54:20.410421 master-0 kubenswrapper[4813]: I1203 19:54:20.410373 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Dec 03 19:54:20.410761 master-0 kubenswrapper[4813]: I1203 19:54:20.410714 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Dec 03 19:54:20.411126 master-0 kubenswrapper[4813]: I1203 19:54:20.411099 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 03 19:54:20.414607 master-0 kubenswrapper[4813]: I1203 19:54:20.414482 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Dec 03 19:54:20.417524 master-0 kubenswrapper[4813]: I1203 19:54:20.417473 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 03 19:54:20.479515 master-0 kubenswrapper[4813]: I1203 19:54:20.479444 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d210062f-c07e-419f-a551-c37571565686-env-overrides\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg"
Dec 03 19:54:20.479515 master-0 kubenswrapper[4813]: I1203 19:54:20.479518 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d210062f-c07e-419f-a551-c37571565686-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg"
Dec
03 19:54:20.479858 master-0 kubenswrapper[4813]: I1203 19:54:20.479635 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7xk9\" (UniqueName: \"kubernetes.io/projected/d210062f-c07e-419f-a551-c37571565686-kube-api-access-v7xk9\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 19:54:20.479858 master-0 kubenswrapper[4813]: I1203 19:54:20.479691 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d210062f-c07e-419f-a551-c37571565686-ovnkube-config\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 19:54:20.580155 master-0 kubenswrapper[4813]: I1203 19:54:20.580074 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d210062f-c07e-419f-a551-c37571565686-env-overrides\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 19:54:20.580155 master-0 kubenswrapper[4813]: I1203 19:54:20.580151 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d210062f-c07e-419f-a551-c37571565686-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 19:54:20.580549 master-0 kubenswrapper[4813]: I1203 19:54:20.580224 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7xk9\" 
(UniqueName: \"kubernetes.io/projected/d210062f-c07e-419f-a551-c37571565686-kube-api-access-v7xk9\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 19:54:20.580549 master-0 kubenswrapper[4813]: I1203 19:54:20.580267 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d210062f-c07e-419f-a551-c37571565686-ovnkube-config\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 19:54:20.581443 master-0 kubenswrapper[4813]: I1203 19:54:20.581368 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d210062f-c07e-419f-a551-c37571565686-env-overrides\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 19:54:20.581598 master-0 kubenswrapper[4813]: I1203 19:54:20.581532 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d210062f-c07e-419f-a551-c37571565686-ovnkube-config\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 19:54:20.585836 master-0 kubenswrapper[4813]: I1203 19:54:20.585410 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d210062f-c07e-419f-a551-c37571565686-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 19:54:20.600992 master-0 kubenswrapper[4813]: I1203 19:54:20.600877 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7xk9\" (UniqueName: \"kubernetes.io/projected/d210062f-c07e-419f-a551-c37571565686-kube-api-access-v7xk9\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 19:54:20.606310 master-0 kubenswrapper[4813]: I1203 19:54:20.606228 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xghq7"] Dec 03 19:54:20.607807 master-0 kubenswrapper[4813]: I1203 19:54:20.607706 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.609685 master-0 kubenswrapper[4813]: I1203 19:54:20.609620 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 19:54:20.609883 master-0 kubenswrapper[4813]: I1203 19:54:20.609873 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 19:54:20.681173 master-0 kubenswrapper[4813]: I1203 19:54:20.681074 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-ovnkube-config\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.681173 master-0 kubenswrapper[4813]: I1203 19:54:20.681145 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.681173 master-0 kubenswrapper[4813]: I1203 19:54:20.681181 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-node-log\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.681514 master-0 kubenswrapper[4813]: I1203 19:54:20.681216 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-cni-netd\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.681514 master-0 kubenswrapper[4813]: I1203 19:54:20.681264 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-ovn-node-metrics-cert\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.681514 master-0 kubenswrapper[4813]: I1203 19:54:20.681312 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-run-openvswitch\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.681514 master-0 kubenswrapper[4813]: I1203 19:54:20.681358 4813 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-log-socket\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.681514 master-0 kubenswrapper[4813]: I1203 19:54:20.681423 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-systemd-units\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.681514 master-0 kubenswrapper[4813]: I1203 19:54:20.681473 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-run-ovn-kubernetes\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.681974 master-0 kubenswrapper[4813]: I1203 19:54:20.681567 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-slash\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.681974 master-0 kubenswrapper[4813]: I1203 19:54:20.681622 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts8d6\" (UniqueName: \"kubernetes.io/projected/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-kube-api-access-ts8d6\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 
19:54:20.681974 master-0 kubenswrapper[4813]: I1203 19:54:20.681658 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-var-lib-openvswitch\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.681974 master-0 kubenswrapper[4813]: I1203 19:54:20.681717 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-kubelet\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.681974 master-0 kubenswrapper[4813]: I1203 19:54:20.681765 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-ovnkube-script-lib\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.681974 master-0 kubenswrapper[4813]: I1203 19:54:20.681841 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-cni-bin\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.681974 master-0 kubenswrapper[4813]: I1203 19:54:20.681880 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: 
\"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:54:20.681974 master-0 kubenswrapper[4813]: I1203 19:54:20.681916 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-run-netns\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.681974 master-0 kubenswrapper[4813]: I1203 19:54:20.681948 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-run-ovn\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.682467 master-0 kubenswrapper[4813]: I1203 19:54:20.681989 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-run-systemd\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.682467 master-0 kubenswrapper[4813]: E1203 19:54:20.682124 4813 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 19:54:20.682467 master-0 kubenswrapper[4813]: I1203 19:54:20.682140 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-env-overrides\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 
19:54:20.682467 master-0 kubenswrapper[4813]: E1203 19:54:20.682236 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert podName:0c45d22f-1492-47d7-83b6-6dd356a8454d nodeName:}" failed. No retries permitted until 2025-12-03 19:54:52.682201469 +0000 UTC m=+137.110999948 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert") pod "cluster-version-operator-869c786959-zbl42" (UID: "0c45d22f-1492-47d7-83b6-6dd356a8454d") : secret "cluster-version-operator-serving-cert" not found Dec 03 19:54:20.682467 master-0 kubenswrapper[4813]: I1203 19:54:20.682269 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-etc-openvswitch\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.727360 master-0 kubenswrapper[4813]: I1203 19:54:20.727250 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 19:54:20.746446 master-0 kubenswrapper[4813]: W1203 19:54:20.746378 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd210062f_c07e_419f_a551_c37571565686.slice/crio-b7019b680708a6b0cc34565d068ec422e5cf82d6c1379cc668471d678f72f33d WatchSource:0}: Error finding container b7019b680708a6b0cc34565d068ec422e5cf82d6c1379cc668471d678f72f33d: Status 404 returned error can't find the container with id b7019b680708a6b0cc34565d068ec422e5cf82d6c1379cc668471d678f72f33d Dec 03 19:54:20.783321 master-0 kubenswrapper[4813]: I1203 19:54:20.783204 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-kubelet\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.783321 master-0 kubenswrapper[4813]: I1203 19:54:20.783295 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-var-lib-openvswitch\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.783874 master-0 kubenswrapper[4813]: I1203 19:54:20.783347 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-ovnkube-script-lib\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.783874 master-0 kubenswrapper[4813]: I1203 19:54:20.783364 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-kubelet\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.783874 master-0 kubenswrapper[4813]: I1203 19:54:20.783414 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-run-netns\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.783874 master-0 kubenswrapper[4813]: I1203 19:54:20.783430 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-var-lib-openvswitch\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.783874 master-0 kubenswrapper[4813]: I1203 19:54:20.783666 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-cni-bin\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.783874 master-0 kubenswrapper[4813]: I1203 19:54:20.783636 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-run-netns\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.783874 master-0 kubenswrapper[4813]: I1203 19:54:20.783743 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-cni-bin\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.783874 master-0 kubenswrapper[4813]: I1203 19:54:20.783713 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-run-ovn\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.783874 master-0 kubenswrapper[4813]: I1203 19:54:20.783820 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-run-ovn\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.783874 master-0 kubenswrapper[4813]: I1203 19:54:20.783859 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-run-systemd\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.784695 master-0 kubenswrapper[4813]: I1203 19:54:20.783912 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-env-overrides\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.784695 master-0 kubenswrapper[4813]: I1203 19:54:20.783963 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-etc-openvswitch\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.784695 master-0 kubenswrapper[4813]: I1203 19:54:20.784017 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.784695 master-0 kubenswrapper[4813]: I1203 19:54:20.783961 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-run-systemd\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.784695 master-0 kubenswrapper[4813]: I1203 19:54:20.784058 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-etc-openvswitch\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.784695 master-0 kubenswrapper[4813]: I1203 19:54:20.784064 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-ovnkube-config\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.784695 master-0 kubenswrapper[4813]: I1203 19:54:20.784103 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.784695 master-0 kubenswrapper[4813]: I1203 19:54:20.784110 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-node-log\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.784695 master-0 kubenswrapper[4813]: I1203 19:54:20.784144 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-node-log\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.784695 master-0 kubenswrapper[4813]: I1203 19:54:20.784144 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-cni-netd\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.784695 master-0 kubenswrapper[4813]: I1203 19:54:20.784205 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-cni-netd\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.784695 master-0 kubenswrapper[4813]: I1203 19:54:20.784216 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-ovn-node-metrics-cert\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.784695 master-0 kubenswrapper[4813]: I1203 19:54:20.784239 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-run-openvswitch\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.784695 master-0 kubenswrapper[4813]: I1203 19:54:20.784259 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-log-socket\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.784695 master-0 kubenswrapper[4813]: I1203 19:54:20.784281 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-systemd-units\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.784695 master-0 kubenswrapper[4813]: I1203 19:54:20.784305 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-run-ovn-kubernetes\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.784695 master-0 kubenswrapper[4813]: I1203 19:54:20.784338 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-slash\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.785935 master-0 kubenswrapper[4813]: I1203 19:54:20.784361 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts8d6\" (UniqueName: \"kubernetes.io/projected/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-kube-api-access-ts8d6\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.785935 master-0 kubenswrapper[4813]: I1203 19:54:20.784378 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-log-socket\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.785935 master-0 kubenswrapper[4813]: I1203 19:54:20.784428 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-run-openvswitch\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.785935 master-0 kubenswrapper[4813]: I1203 19:54:20.784695 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-run-ovn-kubernetes\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.785935 master-0 kubenswrapper[4813]: I1203 19:54:20.784746 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-systemd-units\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.785935 master-0 kubenswrapper[4813]: I1203 19:54:20.784769 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-slash\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.785935 master-0 kubenswrapper[4813]: I1203 19:54:20.785399 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-ovnkube-script-lib\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.785935 master-0 kubenswrapper[4813]: I1203 19:54:20.785657 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-ovnkube-config\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.786376 master-0 kubenswrapper[4813]: I1203 19:54:20.785976 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-env-overrides\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.791900 master-0 kubenswrapper[4813]: I1203 19:54:20.791834 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-ovn-node-metrics-cert\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.816267 master-0 kubenswrapper[4813]: I1203 19:54:20.816158 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts8d6\" (UniqueName: \"kubernetes.io/projected/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-kube-api-access-ts8d6\") pod \"ovnkube-node-xghq7\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.921339 master-0 kubenswrapper[4813]: I1203 19:54:20.921297 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:20.940037 master-0 kubenswrapper[4813]: W1203 19:54:20.939991 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb5c813b_2e17_4004_bf26_a26c26a5ed8f.slice/crio-46c36de02c52c74fa950885a9f6ca90bd94c0ce1773d696e2cba0138494bdb20 WatchSource:0}: Error finding container 46c36de02c52c74fa950885a9f6ca90bd94c0ce1773d696e2cba0138494bdb20: Status 404 returned error can't find the container with id 46c36de02c52c74fa950885a9f6ca90bd94c0ce1773d696e2cba0138494bdb20 Dec 03 19:54:21.022828 master-0 kubenswrapper[4813]: I1203 19:54:21.022720 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:21.023110 master-0 kubenswrapper[4813]: E1203 19:54:21.022921 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:21.348593 master-0 kubenswrapper[4813]: I1203 19:54:21.348478 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" event={"ID":"d210062f-c07e-419f-a551-c37571565686","Type":"ContainerStarted","Data":"841bb03fe2251dcb7a74b2bc67fd9479ba5b31babb8eac132a16e2abde34e62f"} Dec 03 19:54:21.348593 master-0 kubenswrapper[4813]: I1203 19:54:21.348530 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" event={"ID":"d210062f-c07e-419f-a551-c37571565686","Type":"ContainerStarted","Data":"b7019b680708a6b0cc34565d068ec422e5cf82d6c1379cc668471d678f72f33d"} Dec 03 19:54:21.350715 master-0 kubenswrapper[4813]: I1203 19:54:21.349659 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerStarted","Data":"46c36de02c52c74fa950885a9f6ca90bd94c0ce1773d696e2cba0138494bdb20"} Dec 03 19:54:22.354110 master-0 kubenswrapper[4813]: I1203 19:54:22.354021 4813 generic.go:334] "Generic (PLEG): container finished" podID="87f1759a-7df4-442e-a22d-6de8d54be333" containerID="5e06cf682588907f65a412d4ac6d4481e139ecf6ab4739442acce6158ba8872d" exitCode=0 Dec 03 19:54:22.354110 master-0 kubenswrapper[4813]: I1203 19:54:22.354068 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pwlw2" event={"ID":"87f1759a-7df4-442e-a22d-6de8d54be333","Type":"ContainerDied","Data":"5e06cf682588907f65a412d4ac6d4481e139ecf6ab4739442acce6158ba8872d"} Dec 03 19:54:23.022428 master-0 kubenswrapper[4813]: I1203 19:54:23.022372 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:23.022623 master-0 kubenswrapper[4813]: E1203 19:54:23.022501 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:23.583144 master-0 kubenswrapper[4813]: I1203 19:54:23.583090 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-x6vwd"] Dec 03 19:54:23.583541 master-0 kubenswrapper[4813]: I1203 19:54:23.583447 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:23.583541 master-0 kubenswrapper[4813]: E1203 19:54:23.583522 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:23.709640 master-0 kubenswrapper[4813]: I1203 19:54:23.709531 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkhcw\" (UniqueName: \"kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw\") pod \"network-check-target-x6vwd\" (UID: \"830d89af-1266-43ac-b113-990a28595f91\") " pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:23.810432 master-0 kubenswrapper[4813]: I1203 19:54:23.810347 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkhcw\" (UniqueName: \"kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw\") pod \"network-check-target-x6vwd\" (UID: \"830d89af-1266-43ac-b113-990a28595f91\") " pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:23.822957 master-0 kubenswrapper[4813]: E1203 19:54:23.822877 4813 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 19:54:23.822957 master-0 kubenswrapper[4813]: E1203 19:54:23.822925 4813 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 19:54:23.822957 master-0 kubenswrapper[4813]: E1203 19:54:23.822946 4813 projected.go:194] Error preparing data for projected volume kube-api-access-lkhcw for pod openshift-network-diagnostics/network-check-target-x6vwd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 19:54:23.823327 master-0 kubenswrapper[4813]: E1203 19:54:23.823023 4813 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw podName:830d89af-1266-43ac-b113-990a28595f91 nodeName:}" failed. No retries permitted until 2025-12-03 19:54:24.323003212 +0000 UTC m=+108.751801671 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-lkhcw" (UniqueName: "kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw") pod "network-check-target-x6vwd" (UID: "830d89af-1266-43ac-b113-990a28595f91") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 19:54:24.361463 master-0 kubenswrapper[4813]: I1203 19:54:24.361391 4813 generic.go:334] "Generic (PLEG): container finished" podID="87f1759a-7df4-442e-a22d-6de8d54be333" containerID="3e816effb094becdc3c407acbb3f9f27817216cdbfc7352da3c72fba2c274e3e" exitCode=0 Dec 03 19:54:24.361463 master-0 kubenswrapper[4813]: I1203 19:54:24.361455 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pwlw2" event={"ID":"87f1759a-7df4-442e-a22d-6de8d54be333","Type":"ContainerDied","Data":"3e816effb094becdc3c407acbb3f9f27817216cdbfc7352da3c72fba2c274e3e"} Dec 03 19:54:24.416474 master-0 kubenswrapper[4813]: I1203 19:54:24.414563 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkhcw\" (UniqueName: \"kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw\") pod \"network-check-target-x6vwd\" (UID: \"830d89af-1266-43ac-b113-990a28595f91\") " pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:24.416474 master-0 kubenswrapper[4813]: E1203 19:54:24.414673 4813 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 19:54:24.416474 master-0 
kubenswrapper[4813]: E1203 19:54:24.414687 4813 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 19:54:24.416474 master-0 kubenswrapper[4813]: E1203 19:54:24.414697 4813 projected.go:194] Error preparing data for projected volume kube-api-access-lkhcw for pod openshift-network-diagnostics/network-check-target-x6vwd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 19:54:24.416474 master-0 kubenswrapper[4813]: E1203 19:54:24.414740 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw podName:830d89af-1266-43ac-b113-990a28595f91 nodeName:}" failed. No retries permitted until 2025-12-03 19:54:25.414727519 +0000 UTC m=+109.843525968 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-lkhcw" (UniqueName: "kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw") pod "network-check-target-x6vwd" (UID: "830d89af-1266-43ac-b113-990a28595f91") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 19:54:24.817901 master-0 kubenswrapper[4813]: I1203 19:54:24.817514 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:24.818579 master-0 kubenswrapper[4813]: E1203 19:54:24.817923 4813 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 19:54:24.818579 master-0 kubenswrapper[4813]: E1203 19:54:24.818060 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs podName:46b5d4d0-b841-4e87-84b4-85911ff04325 nodeName:}" failed. No retries permitted until 2025-12-03 19:54:40.818038178 +0000 UTC m=+125.246836637 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs") pod "network-metrics-daemon-hs6gf" (UID: "46b5d4d0-b841-4e87-84b4-85911ff04325") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 19:54:25.022825 master-0 kubenswrapper[4813]: I1203 19:54:25.022772 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:25.023027 master-0 kubenswrapper[4813]: I1203 19:54:25.022834 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:25.023027 master-0 kubenswrapper[4813]: E1203 19:54:25.022904 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:25.023027 master-0 kubenswrapper[4813]: E1203 19:54:25.022987 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:25.421754 master-0 kubenswrapper[4813]: I1203 19:54:25.421668 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkhcw\" (UniqueName: \"kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw\") pod \"network-check-target-x6vwd\" (UID: \"830d89af-1266-43ac-b113-990a28595f91\") " pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:25.422038 master-0 kubenswrapper[4813]: E1203 19:54:25.421830 4813 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 19:54:25.422038 master-0 kubenswrapper[4813]: E1203 19:54:25.421847 4813 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 19:54:25.422038 master-0 kubenswrapper[4813]: E1203 19:54:25.421859 4813 projected.go:194] Error preparing data for projected volume kube-api-access-lkhcw for pod openshift-network-diagnostics/network-check-target-x6vwd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 19:54:25.422038 master-0 kubenswrapper[4813]: E1203 19:54:25.421910 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw podName:830d89af-1266-43ac-b113-990a28595f91 nodeName:}" failed. No retries permitted until 2025-12-03 19:54:27.421895134 +0000 UTC m=+111.850693583 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-lkhcw" (UniqueName: "kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw") pod "network-check-target-x6vwd" (UID: "830d89af-1266-43ac-b113-990a28595f91") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 19:54:26.197178 master-0 kubenswrapper[4813]: I1203 19:54:26.197133 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-r2kpn"] Dec 03 19:54:26.197708 master-0 kubenswrapper[4813]: I1203 19:54:26.197519 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 19:54:26.201410 master-0 kubenswrapper[4813]: I1203 19:54:26.198888 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 19:54:26.201410 master-0 kubenswrapper[4813]: I1203 19:54:26.199679 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 19:54:26.201410 master-0 kubenswrapper[4813]: I1203 19:54:26.199932 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 19:54:26.201410 master-0 kubenswrapper[4813]: I1203 19:54:26.199963 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 19:54:26.201410 master-0 kubenswrapper[4813]: I1203 19:54:26.199936 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 19:54:26.228661 master-0 kubenswrapper[4813]: I1203 19:54:26.228622 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-webhook-cert\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 19:54:26.228842 master-0 kubenswrapper[4813]: I1203 19:54:26.228689 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-env-overrides\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 19:54:26.228842 master-0 kubenswrapper[4813]: I1203 19:54:26.228718 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-ovnkube-identity-cm\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 19:54:26.228842 master-0 kubenswrapper[4813]: I1203 19:54:26.228735 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhg82\" (UniqueName: \"kubernetes.io/projected/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-kube-api-access-qhg82\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 19:54:26.329928 master-0 kubenswrapper[4813]: I1203 19:54:26.329881 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-ovnkube-identity-cm\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " 
pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 19:54:26.329928 master-0 kubenswrapper[4813]: I1203 19:54:26.329926 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhg82\" (UniqueName: \"kubernetes.io/projected/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-kube-api-access-qhg82\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 19:54:26.329928 master-0 kubenswrapper[4813]: I1203 19:54:26.329944 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-webhook-cert\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 19:54:26.330191 master-0 kubenswrapper[4813]: I1203 19:54:26.329989 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-env-overrides\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 19:54:26.330262 master-0 kubenswrapper[4813]: E1203 19:54:26.330230 4813 secret.go:189] Couldn't get secret openshift-network-node-identity/network-node-identity-cert: secret "network-node-identity-cert" not found Dec 03 19:54:26.330362 master-0 kubenswrapper[4813]: E1203 19:54:26.330281 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-webhook-cert podName:c4d45235-fb1a-4626-a41e-b1e34f7bf76e nodeName:}" failed. No retries permitted until 2025-12-03 19:54:26.830268441 +0000 UTC m=+111.259066890 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-webhook-cert") pod "network-node-identity-r2kpn" (UID: "c4d45235-fb1a-4626-a41e-b1e34f7bf76e") : secret "network-node-identity-cert" not found Dec 03 19:54:26.331972 master-0 kubenswrapper[4813]: I1203 19:54:26.331938 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-env-overrides\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 19:54:26.333035 master-0 kubenswrapper[4813]: I1203 19:54:26.332989 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-ovnkube-identity-cm\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 19:54:26.349005 master-0 kubenswrapper[4813]: I1203 19:54:26.348962 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhg82\" (UniqueName: \"kubernetes.io/projected/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-kube-api-access-qhg82\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 19:54:26.834067 master-0 kubenswrapper[4813]: I1203 19:54:26.833998 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-webhook-cert\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 19:54:26.837739 master-0 
kubenswrapper[4813]: I1203 19:54:26.837680 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-webhook-cert\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 19:54:27.022904 master-0 kubenswrapper[4813]: I1203 19:54:27.022746 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:27.023116 master-0 kubenswrapper[4813]: E1203 19:54:27.022902 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:27.023601 master-0 kubenswrapper[4813]: I1203 19:54:27.023235 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:27.023601 master-0 kubenswrapper[4813]: E1203 19:54:27.023549 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:27.114046 master-0 kubenswrapper[4813]: I1203 19:54:27.113993 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 19:54:27.124356 master-0 kubenswrapper[4813]: W1203 19:54:27.124314 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4d45235_fb1a_4626_a41e_b1e34f7bf76e.slice/crio-75f2d2ce983b4d5090010050d78ba28c8452643f80661c230a1cbdc90a216214 WatchSource:0}: Error finding container 75f2d2ce983b4d5090010050d78ba28c8452643f80661c230a1cbdc90a216214: Status 404 returned error can't find the container with id 75f2d2ce983b4d5090010050d78ba28c8452643f80661c230a1cbdc90a216214 Dec 03 19:54:27.372683 master-0 kubenswrapper[4813]: I1203 19:54:27.372561 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-r2kpn" event={"ID":"c4d45235-fb1a-4626-a41e-b1e34f7bf76e","Type":"ContainerStarted","Data":"75f2d2ce983b4d5090010050d78ba28c8452643f80661c230a1cbdc90a216214"} Dec 03 19:54:27.439546 master-0 kubenswrapper[4813]: I1203 19:54:27.439496 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkhcw\" (UniqueName: \"kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw\") pod \"network-check-target-x6vwd\" (UID: \"830d89af-1266-43ac-b113-990a28595f91\") " pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:27.439726 master-0 kubenswrapper[4813]: E1203 19:54:27.439654 4813 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 19:54:27.439726 master-0 kubenswrapper[4813]: E1203 19:54:27.439672 4813 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 19:54:27.439726 master-0 kubenswrapper[4813]: E1203 19:54:27.439686 
4813 projected.go:194] Error preparing data for projected volume kube-api-access-lkhcw for pod openshift-network-diagnostics/network-check-target-x6vwd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 19:54:27.439873 master-0 kubenswrapper[4813]: E1203 19:54:27.439734 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw podName:830d89af-1266-43ac-b113-990a28595f91 nodeName:}" failed. No retries permitted until 2025-12-03 19:54:31.439717288 +0000 UTC m=+115.868515737 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-lkhcw" (UniqueName: "kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw") pod "network-check-target-x6vwd" (UID: "830d89af-1266-43ac-b113-990a28595f91") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 19:54:29.023056 master-0 kubenswrapper[4813]: I1203 19:54:29.023002 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:29.023056 master-0 kubenswrapper[4813]: I1203 19:54:29.023038 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:29.023621 master-0 kubenswrapper[4813]: E1203 19:54:29.023149 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:29.023621 master-0 kubenswrapper[4813]: E1203 19:54:29.023263 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:31.023037 master-0 kubenswrapper[4813]: I1203 19:54:31.022970 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:31.024266 master-0 kubenswrapper[4813]: I1203 19:54:31.023137 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:31.024266 master-0 kubenswrapper[4813]: E1203 19:54:31.023325 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:31.024266 master-0 kubenswrapper[4813]: E1203 19:54:31.023494 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:31.508182 master-0 kubenswrapper[4813]: I1203 19:54:31.508069 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkhcw\" (UniqueName: \"kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw\") pod \"network-check-target-x6vwd\" (UID: \"830d89af-1266-43ac-b113-990a28595f91\") " pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:31.508482 master-0 kubenswrapper[4813]: E1203 19:54:31.508322 4813 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 19:54:31.508482 master-0 kubenswrapper[4813]: E1203 19:54:31.508349 4813 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 19:54:31.508482 master-0 kubenswrapper[4813]: E1203 19:54:31.508364 4813 projected.go:194] Error preparing data for projected volume kube-api-access-lkhcw for pod openshift-network-diagnostics/network-check-target-x6vwd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 19:54:31.508482 master-0 kubenswrapper[4813]: E1203 19:54:31.508431 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw podName:830d89af-1266-43ac-b113-990a28595f91 nodeName:}" failed. No retries permitted until 2025-12-03 19:54:39.508411294 +0000 UTC m=+123.937209743 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-lkhcw" (UniqueName: "kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw") pod "network-check-target-x6vwd" (UID: "830d89af-1266-43ac-b113-990a28595f91") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 19:54:33.023363 master-0 kubenswrapper[4813]: I1203 19:54:33.023295 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:33.023899 master-0 kubenswrapper[4813]: I1203 19:54:33.023307 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:33.023899 master-0 kubenswrapper[4813]: E1203 19:54:33.023410 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:33.023899 master-0 kubenswrapper[4813]: E1203 19:54:33.023528 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:35.023244 master-0 kubenswrapper[4813]: I1203 19:54:35.023144 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:35.023244 master-0 kubenswrapper[4813]: I1203 19:54:35.023189 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:35.023819 master-0 kubenswrapper[4813]: E1203 19:54:35.023298 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:35.023819 master-0 kubenswrapper[4813]: E1203 19:54:35.023413 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:35.859315 master-0 kubenswrapper[4813]: E1203 19:54:35.858287 4813 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 03 19:54:35.980070 master-0 kubenswrapper[4813]: E1203 19:54:35.980014 4813 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 19:54:37.023265 master-0 kubenswrapper[4813]: I1203 19:54:37.023208 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:37.023694 master-0 kubenswrapper[4813]: I1203 19:54:37.023228 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:37.023694 master-0 kubenswrapper[4813]: E1203 19:54:37.023367 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:37.023694 master-0 kubenswrapper[4813]: E1203 19:54:37.023488 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:38.399093 master-0 kubenswrapper[4813]: I1203 19:54:38.398807 4813 generic.go:334] "Generic (PLEG): container finished" podID="87f1759a-7df4-442e-a22d-6de8d54be333" containerID="a396f10beccb65f07ed52d9f7eed56b73ee45537150d1fb69cde98622f0ce32a" exitCode=0 Dec 03 19:54:38.399802 master-0 kubenswrapper[4813]: I1203 19:54:38.399112 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pwlw2" event={"ID":"87f1759a-7df4-442e-a22d-6de8d54be333","Type":"ContainerDied","Data":"a396f10beccb65f07ed52d9f7eed56b73ee45537150d1fb69cde98622f0ce32a"} Dec 03 19:54:38.405259 master-0 kubenswrapper[4813]: I1203 19:54:38.405218 4813 generic.go:334] "Generic (PLEG): container finished" podID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerID="e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564" exitCode=0 Dec 03 19:54:38.405347 master-0 kubenswrapper[4813]: I1203 19:54:38.405279 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerDied","Data":"e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564"} Dec 03 19:54:38.420082 master-0 kubenswrapper[4813]: I1203 19:54:38.419988 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" event={"ID":"d210062f-c07e-419f-a551-c37571565686","Type":"ContainerStarted","Data":"2d7be3731fbc745283a2d759f396c31ac1367c0ba714305c646e32b354747fdc"} Dec 03 19:54:38.478657 master-0 kubenswrapper[4813]: I1203 19:54:38.478147 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" podStartSLOduration=1.7310243490000001 podStartE2EDuration="18.478128949s" podCreationTimestamp="2025-12-03 19:54:20 +0000 UTC" 
firstStartedPulling="2025-12-03 19:54:20.983198835 +0000 UTC m=+105.411997284" lastFinishedPulling="2025-12-03 19:54:37.730303435 +0000 UTC m=+122.159101884" observedRunningTime="2025-12-03 19:54:38.456564284 +0000 UTC m=+122.885362743" watchObservedRunningTime="2025-12-03 19:54:38.478128949 +0000 UTC m=+122.906927398" Dec 03 19:54:39.023171 master-0 kubenswrapper[4813]: I1203 19:54:39.022857 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:39.023449 master-0 kubenswrapper[4813]: E1203 19:54:39.023208 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:39.023449 master-0 kubenswrapper[4813]: I1203 19:54:39.022950 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:39.023449 master-0 kubenswrapper[4813]: E1203 19:54:39.023415 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:39.430532 master-0 kubenswrapper[4813]: I1203 19:54:39.430295 4813 generic.go:334] "Generic (PLEG): container finished" podID="87f1759a-7df4-442e-a22d-6de8d54be333" containerID="b2c2ebffcad93a655874c4b2c0e0dae1edf07cc0c8e231705d220b5fe6aadf15" exitCode=0 Dec 03 19:54:39.430532 master-0 kubenswrapper[4813]: I1203 19:54:39.430431 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pwlw2" event={"ID":"87f1759a-7df4-442e-a22d-6de8d54be333","Type":"ContainerDied","Data":"b2c2ebffcad93a655874c4b2c0e0dae1edf07cc0c8e231705d220b5fe6aadf15"} Dec 03 19:54:39.433993 master-0 kubenswrapper[4813]: I1203 19:54:39.433928 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-r2kpn" event={"ID":"c4d45235-fb1a-4626-a41e-b1e34f7bf76e","Type":"ContainerStarted","Data":"65f13f5f310f6f953b71a1a783c24c03bd5eb6d2106c3ba74515208177e8e054"} Dec 03 19:54:39.434107 master-0 kubenswrapper[4813]: I1203 19:54:39.433989 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-r2kpn" event={"ID":"c4d45235-fb1a-4626-a41e-b1e34f7bf76e","Type":"ContainerStarted","Data":"237b9bf15f47012be6be63ea09466b3e35a44c39709a1a2ddc2728c5f28b6537"} Dec 03 19:54:39.444313 master-0 kubenswrapper[4813]: I1203 19:54:39.444251 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerStarted","Data":"c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04"} Dec 03 19:54:39.444313 master-0 kubenswrapper[4813]: I1203 19:54:39.444305 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" 
event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerStarted","Data":"58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b"} Dec 03 19:54:39.444526 master-0 kubenswrapper[4813]: I1203 19:54:39.444324 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerStarted","Data":"e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5"} Dec 03 19:54:39.444526 master-0 kubenswrapper[4813]: I1203 19:54:39.444348 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerStarted","Data":"024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638"} Dec 03 19:54:39.444526 master-0 kubenswrapper[4813]: I1203 19:54:39.444365 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerStarted","Data":"28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2"} Dec 03 19:54:39.444526 master-0 kubenswrapper[4813]: I1203 19:54:39.444386 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerStarted","Data":"18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020"} Dec 03 19:54:39.483695 master-0 kubenswrapper[4813]: I1203 19:54:39.483546 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-r2kpn" podStartSLOduration=2.458328575 podStartE2EDuration="13.483516121s" podCreationTimestamp="2025-12-03 19:54:26 +0000 UTC" firstStartedPulling="2025-12-03 19:54:27.126333216 +0000 UTC m=+111.555131665" lastFinishedPulling="2025-12-03 19:54:38.151520762 +0000 UTC m=+122.580319211" 
observedRunningTime="2025-12-03 19:54:39.482904317 +0000 UTC m=+123.911702826" watchObservedRunningTime="2025-12-03 19:54:39.483516121 +0000 UTC m=+123.912314600" Dec 03 19:54:39.512343 master-0 kubenswrapper[4813]: I1203 19:54:39.512235 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkhcw\" (UniqueName: \"kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw\") pod \"network-check-target-x6vwd\" (UID: \"830d89af-1266-43ac-b113-990a28595f91\") " pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:39.512578 master-0 kubenswrapper[4813]: E1203 19:54:39.512484 4813 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 19:54:39.512578 master-0 kubenswrapper[4813]: E1203 19:54:39.512523 4813 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 19:54:39.512578 master-0 kubenswrapper[4813]: E1203 19:54:39.512543 4813 projected.go:194] Error preparing data for projected volume kube-api-access-lkhcw for pod openshift-network-diagnostics/network-check-target-x6vwd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 19:54:39.512766 master-0 kubenswrapper[4813]: E1203 19:54:39.512625 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw podName:830d89af-1266-43ac-b113-990a28595f91 nodeName:}" failed. No retries permitted until 2025-12-03 19:54:55.512602346 +0000 UTC m=+139.941400825 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-lkhcw" (UniqueName: "kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw") pod "network-check-target-x6vwd" (UID: "830d89af-1266-43ac-b113-990a28595f91") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 19:54:40.453846 master-0 kubenswrapper[4813]: I1203 19:54:40.453723 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pwlw2" event={"ID":"87f1759a-7df4-442e-a22d-6de8d54be333","Type":"ContainerStarted","Data":"8b214efba2343f1627cde90c337ab4b6277995196f54dd57ac66f0a685440416"} Dec 03 19:54:40.826029 master-0 kubenswrapper[4813]: I1203 19:54:40.825396 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:40.826029 master-0 kubenswrapper[4813]: E1203 19:54:40.825644 4813 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 19:54:40.826029 master-0 kubenswrapper[4813]: E1203 19:54:40.825975 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs podName:46b5d4d0-b841-4e87-84b4-85911ff04325 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:12.825956691 +0000 UTC m=+157.254755150 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs") pod "network-metrics-daemon-hs6gf" (UID: "46b5d4d0-b841-4e87-84b4-85911ff04325") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 19:54:40.982343 master-0 kubenswrapper[4813]: E1203 19:54:40.982209 4813 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 19:54:41.022931 master-0 kubenswrapper[4813]: I1203 19:54:41.022843 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:41.022931 master-0 kubenswrapper[4813]: I1203 19:54:41.022894 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:41.023244 master-0 kubenswrapper[4813]: E1203 19:54:41.023059 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:41.023244 master-0 kubenswrapper[4813]: E1203 19:54:41.023203 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:41.464289 master-0 kubenswrapper[4813]: I1203 19:54:41.464178 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerStarted","Data":"7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d"} Dec 03 19:54:43.022547 master-0 kubenswrapper[4813]: I1203 19:54:43.022448 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:43.023380 master-0 kubenswrapper[4813]: E1203 19:54:43.022614 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:43.023380 master-0 kubenswrapper[4813]: I1203 19:54:43.022455 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:43.023380 master-0 kubenswrapper[4813]: E1203 19:54:43.022734 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:44.483249 master-0 kubenswrapper[4813]: I1203 19:54:44.483159 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerStarted","Data":"19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08"} Dec 03 19:54:44.484214 master-0 kubenswrapper[4813]: I1203 19:54:44.483373 4813 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:44.484214 master-0 kubenswrapper[4813]: I1203 19:54:44.483565 4813 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:44.484214 master-0 kubenswrapper[4813]: I1203 19:54:44.483635 4813 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:44.539740 master-0 kubenswrapper[4813]: I1203 19:54:44.527290 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-pwlw2" podStartSLOduration=7.466828559 podStartE2EDuration="36.527226477s" podCreationTimestamp="2025-12-03 19:54:08 +0000 UTC" firstStartedPulling="2025-12-03 19:54:08.555054615 +0000 UTC m=+92.983853104" lastFinishedPulling="2025-12-03 19:54:37.615452573 +0000 UTC m=+122.044251022" observedRunningTime="2025-12-03 19:54:40.47992123 +0000 UTC m=+124.908719739" watchObservedRunningTime="2025-12-03 19:54:44.527226477 +0000 UTC m=+128.956024966" Dec 03 19:54:44.539740 master-0 kubenswrapper[4813]: I1203 19:54:44.527819 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" podStartSLOduration=7.775610147 podStartE2EDuration="24.52780645s" podCreationTimestamp="2025-12-03 19:54:20 +0000 UTC" 
firstStartedPulling="2025-12-03 19:54:20.945007073 +0000 UTC m=+105.373805562" lastFinishedPulling="2025-12-03 19:54:37.697203396 +0000 UTC m=+122.126001865" observedRunningTime="2025-12-03 19:54:44.524924011 +0000 UTC m=+128.953722520" watchObservedRunningTime="2025-12-03 19:54:44.52780645 +0000 UTC m=+128.956604959" Dec 03 19:54:44.545020 master-0 kubenswrapper[4813]: I1203 19:54:44.544954 4813 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:44.545152 master-0 kubenswrapper[4813]: I1203 19:54:44.545071 4813 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:45.023106 master-0 kubenswrapper[4813]: I1203 19:54:45.023064 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:45.023302 master-0 kubenswrapper[4813]: I1203 19:54:45.023175 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:45.023302 master-0 kubenswrapper[4813]: E1203 19:54:45.023200 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:45.023440 master-0 kubenswrapper[4813]: E1203 19:54:45.023397 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:45.982944 master-0 kubenswrapper[4813]: E1203 19:54:45.982825 4813 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 19:54:46.084817 master-0 kubenswrapper[4813]: I1203 19:54:46.084699 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hs6gf"] Dec 03 19:54:46.085006 master-0 kubenswrapper[4813]: I1203 19:54:46.084952 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:46.085220 master-0 kubenswrapper[4813]: E1203 19:54:46.085152 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:46.089381 master-0 kubenswrapper[4813]: I1203 19:54:46.089286 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-x6vwd"] Dec 03 19:54:46.089483 master-0 kubenswrapper[4813]: I1203 19:54:46.089437 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:46.089570 master-0 kubenswrapper[4813]: E1203 19:54:46.089530 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:48.023473 master-0 kubenswrapper[4813]: I1203 19:54:48.022755 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:48.023473 master-0 kubenswrapper[4813]: I1203 19:54:48.022866 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:48.024495 master-0 kubenswrapper[4813]: E1203 19:54:48.023473 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:48.024495 master-0 kubenswrapper[4813]: E1203 19:54:48.023592 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:50.022659 master-0 kubenswrapper[4813]: I1203 19:54:50.022563 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:50.023881 master-0 kubenswrapper[4813]: I1203 19:54:50.022592 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:50.023881 master-0 kubenswrapper[4813]: E1203 19:54:50.022752 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:50.023881 master-0 kubenswrapper[4813]: E1203 19:54:50.022854 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:50.146504 master-0 kubenswrapper[4813]: I1203 19:54:50.146037 4813 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xghq7"] Dec 03 19:54:50.146504 master-0 kubenswrapper[4813]: I1203 19:54:50.146387 4813 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="ovn-controller" containerID="cri-o://18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020" gracePeriod=30 Dec 03 19:54:50.146922 master-0 kubenswrapper[4813]: I1203 19:54:50.146759 4813 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="sbdb" containerID="cri-o://7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d" gracePeriod=30 Dec 03 19:54:50.146922 master-0 kubenswrapper[4813]: I1203 19:54:50.146809 4813 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="nbdb" containerID="cri-o://c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04" gracePeriod=30 Dec 03 19:54:50.146922 master-0 kubenswrapper[4813]: I1203 19:54:50.146845 4813 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="northd" containerID="cri-o://58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b" gracePeriod=30 Dec 03 19:54:50.146922 master-0 kubenswrapper[4813]: I1203 19:54:50.146890 4813 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" 
podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5" gracePeriod=30 Dec 03 19:54:50.146922 master-0 kubenswrapper[4813]: I1203 19:54:50.146924 4813 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="kube-rbac-proxy-node" containerID="cri-o://024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638" gracePeriod=30 Dec 03 19:54:50.147334 master-0 kubenswrapper[4813]: I1203 19:54:50.146955 4813 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="ovn-acl-logging" containerID="cri-o://28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2" gracePeriod=30 Dec 03 19:54:50.177248 master-0 kubenswrapper[4813]: I1203 19:54:50.177195 4813 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="ovnkube-controller" containerID="cri-o://19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08" gracePeriod=30 Dec 03 19:54:50.182437 master-0 kubenswrapper[4813]: I1203 19:54:50.182314 4813 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="ovnkube-controller" probeResult="failure" output="" Dec 03 19:54:50.504462 master-0 kubenswrapper[4813]: I1203 19:54:50.504393 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xghq7_fb5c813b-2e17-4004-bf26-a26c26a5ed8f/kube-rbac-proxy-ovn-metrics/0.log" Dec 03 19:54:50.505000 master-0 kubenswrapper[4813]: I1203 19:54:50.504975 4813 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xghq7_fb5c813b-2e17-4004-bf26-a26c26a5ed8f/kube-rbac-proxy-node/0.log" Dec 03 19:54:50.505447 master-0 kubenswrapper[4813]: I1203 19:54:50.505426 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xghq7_fb5c813b-2e17-4004-bf26-a26c26a5ed8f/ovn-acl-logging/0.log" Dec 03 19:54:50.506001 master-0 kubenswrapper[4813]: I1203 19:54:50.505955 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xghq7_fb5c813b-2e17-4004-bf26-a26c26a5ed8f/ovn-controller/0.log" Dec 03 19:54:50.506491 master-0 kubenswrapper[4813]: I1203 19:54:50.506460 4813 generic.go:334] "Generic (PLEG): container finished" podID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerID="e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5" exitCode=143 Dec 03 19:54:50.506491 master-0 kubenswrapper[4813]: I1203 19:54:50.506482 4813 generic.go:334] "Generic (PLEG): container finished" podID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerID="024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638" exitCode=143 Dec 03 19:54:50.506491 master-0 kubenswrapper[4813]: I1203 19:54:50.506490 4813 generic.go:334] "Generic (PLEG): container finished" podID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerID="28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2" exitCode=143 Dec 03 19:54:50.506583 master-0 kubenswrapper[4813]: I1203 19:54:50.506498 4813 generic.go:334] "Generic (PLEG): container finished" podID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerID="18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020" exitCode=143 Dec 03 19:54:50.506583 master-0 kubenswrapper[4813]: I1203 19:54:50.506517 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" 
event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerDied","Data":"e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5"} Dec 03 19:54:50.506583 master-0 kubenswrapper[4813]: I1203 19:54:50.506541 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerDied","Data":"024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638"} Dec 03 19:54:50.506583 master-0 kubenswrapper[4813]: I1203 19:54:50.506551 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerDied","Data":"28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2"} Dec 03 19:54:50.506583 master-0 kubenswrapper[4813]: I1203 19:54:50.506562 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerDied","Data":"18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020"} Dec 03 19:54:50.922990 master-0 kubenswrapper[4813]: E1203 19:54:50.922812 4813 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04 is running failed: container process not found" containerID="c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Dec 03 19:54:50.923190 master-0 kubenswrapper[4813]: E1203 19:54:50.922982 4813 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08 is running failed: container process not found" containerID="19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Dec 03 19:54:50.923190 master-0 kubenswrapper[4813]: E1203 19:54:50.923104 4813 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d is running failed: container process not found" containerID="7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Dec 03 19:54:50.923489 master-0 kubenswrapper[4813]: E1203 19:54:50.923428 4813 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08 is running failed: container process not found" containerID="19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Dec 03 19:54:50.923574 master-0 kubenswrapper[4813]: E1203 19:54:50.923467 4813 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04 is running failed: container process not found" containerID="c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Dec 03 19:54:50.923652 master-0 kubenswrapper[4813]: E1203 19:54:50.923605 4813 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d is running failed: container process not found" containerID="7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Dec 03 19:54:50.923913 master-0 kubenswrapper[4813]: E1203 19:54:50.923851 4813 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08 is running failed: container process not found" containerID="19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Dec 03 19:54:50.924069 master-0 kubenswrapper[4813]: E1203 19:54:50.923913 4813 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="ovnkube-controller" Dec 03 19:54:50.924232 master-0 kubenswrapper[4813]: E1203 19:54:50.924163 4813 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d is running failed: container process not found" containerID="7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Dec 03 19:54:50.924232 master-0 kubenswrapper[4813]: E1203 19:54:50.924214 4813 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="sbdb" Dec 03 19:54:50.924407 master-0 kubenswrapper[4813]: E1203 19:54:50.924286 4813 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04 is running failed: container process not found" containerID="c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Dec 03 19:54:50.924407 master-0 kubenswrapper[4813]: E1203 19:54:50.924316 4813 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="nbdb" Dec 03 19:54:50.984244 master-0 kubenswrapper[4813]: E1203 19:54:50.984097 4813 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 03 19:54:51.499056 master-0 kubenswrapper[4813]: I1203 19:54:51.498671 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xghq7_fb5c813b-2e17-4004-bf26-a26c26a5ed8f/kube-rbac-proxy-ovn-metrics/0.log" Dec 03 19:54:51.499973 master-0 kubenswrapper[4813]: I1203 19:54:51.499695 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xghq7_fb5c813b-2e17-4004-bf26-a26c26a5ed8f/kube-rbac-proxy-node/0.log" Dec 03 19:54:51.500336 master-0 kubenswrapper[4813]: I1203 19:54:51.500291 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xghq7_fb5c813b-2e17-4004-bf26-a26c26a5ed8f/ovn-acl-logging/0.log" Dec 03 19:54:51.501008 master-0 kubenswrapper[4813]: I1203 19:54:51.500951 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xghq7_fb5c813b-2e17-4004-bf26-a26c26a5ed8f/ovn-controller/0.log" Dec 03 19:54:51.501585 master-0 kubenswrapper[4813]: I1203 19:54:51.501525 4813 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:51.513211 master-0 kubenswrapper[4813]: I1203 19:54:51.513087 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xghq7_fb5c813b-2e17-4004-bf26-a26c26a5ed8f/kube-rbac-proxy-ovn-metrics/0.log" Dec 03 19:54:51.513835 master-0 kubenswrapper[4813]: I1203 19:54:51.513748 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xghq7_fb5c813b-2e17-4004-bf26-a26c26a5ed8f/kube-rbac-proxy-node/0.log" Dec 03 19:54:51.514266 master-0 kubenswrapper[4813]: I1203 19:54:51.514227 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xghq7_fb5c813b-2e17-4004-bf26-a26c26a5ed8f/ovn-acl-logging/0.log" Dec 03 19:54:51.514830 master-0 kubenswrapper[4813]: I1203 19:54:51.514800 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xghq7_fb5c813b-2e17-4004-bf26-a26c26a5ed8f/ovn-controller/0.log" Dec 03 19:54:51.515237 master-0 kubenswrapper[4813]: I1203 19:54:51.515186 4813 generic.go:334] "Generic (PLEG): container finished" podID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerID="19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08" exitCode=0 Dec 03 19:54:51.515237 master-0 kubenswrapper[4813]: I1203 19:54:51.515220 4813 generic.go:334] "Generic (PLEG): container finished" podID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerID="7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d" exitCode=0 Dec 03 19:54:51.515237 master-0 kubenswrapper[4813]: I1203 19:54:51.515231 4813 generic.go:334] "Generic (PLEG): container finished" podID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerID="c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04" exitCode=0 Dec 03 19:54:51.515237 master-0 kubenswrapper[4813]: I1203 19:54:51.515238 4813 generic.go:334] "Generic (PLEG): container finished" 
podID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerID="58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b" exitCode=0 Dec 03 19:54:51.515542 master-0 kubenswrapper[4813]: I1203 19:54:51.515257 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerDied","Data":"19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08"} Dec 03 19:54:51.515542 master-0 kubenswrapper[4813]: I1203 19:54:51.515263 4813 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" Dec 03 19:54:51.515542 master-0 kubenswrapper[4813]: I1203 19:54:51.515292 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerDied","Data":"7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d"} Dec 03 19:54:51.515542 master-0 kubenswrapper[4813]: I1203 19:54:51.515309 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerDied","Data":"c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04"} Dec 03 19:54:51.515542 master-0 kubenswrapper[4813]: I1203 19:54:51.515324 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerDied","Data":"58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b"} Dec 03 19:54:51.515542 master-0 kubenswrapper[4813]: I1203 19:54:51.515335 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xghq7" event={"ID":"fb5c813b-2e17-4004-bf26-a26c26a5ed8f","Type":"ContainerDied","Data":"46c36de02c52c74fa950885a9f6ca90bd94c0ce1773d696e2cba0138494bdb20"} Dec 03 
19:54:51.515542 master-0 kubenswrapper[4813]: I1203 19:54:51.515352 4813 scope.go:117] "RemoveContainer" containerID="19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08" Dec 03 19:54:51.531574 master-0 kubenswrapper[4813]: I1203 19:54:51.531529 4813 scope.go:117] "RemoveContainer" containerID="7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d" Dec 03 19:54:51.544725 master-0 kubenswrapper[4813]: I1203 19:54:51.544653 4813 scope.go:117] "RemoveContainer" containerID="c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04" Dec 03 19:54:51.553271 master-0 kubenswrapper[4813]: I1203 19:54:51.553169 4813 scope.go:117] "RemoveContainer" containerID="58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b" Dec 03 19:54:51.563484 master-0 kubenswrapper[4813]: I1203 19:54:51.563439 4813 scope.go:117] "RemoveContainer" containerID="e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5" Dec 03 19:54:51.572400 master-0 kubenswrapper[4813]: I1203 19:54:51.572364 4813 scope.go:117] "RemoveContainer" containerID="024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638" Dec 03 19:54:51.584909 master-0 kubenswrapper[4813]: I1203 19:54:51.584867 4813 scope.go:117] "RemoveContainer" containerID="28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2" Dec 03 19:54:51.597646 master-0 kubenswrapper[4813]: I1203 19:54:51.597606 4813 scope.go:117] "RemoveContainer" containerID="18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020" Dec 03 19:54:51.607039 master-0 kubenswrapper[4813]: I1203 19:54:51.606988 4813 scope.go:117] "RemoveContainer" containerID="e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564" Dec 03 19:54:51.613292 master-0 kubenswrapper[4813]: I1203 19:54:51.613234 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-ovnkube-script-lib\") pod 
\"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.613292 master-0 kubenswrapper[4813]: I1203 19:54:51.613272 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-log-socket\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.613292 master-0 kubenswrapper[4813]: I1203 19:54:51.613295 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-node-log\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.614771 master-0 kubenswrapper[4813]: I1203 19:54:51.613324 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-systemd-units\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.614771 master-0 kubenswrapper[4813]: I1203 19:54:51.613352 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-run-ovn\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.614771 master-0 kubenswrapper[4813]: I1203 19:54:51.613381 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-run-systemd\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.614771 master-0 kubenswrapper[4813]: I1203 19:54:51.613414 4813 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-env-overrides\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.614771 master-0 kubenswrapper[4813]: I1203 19:54:51.613452 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-etc-openvswitch\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.614771 master-0 kubenswrapper[4813]: I1203 19:54:51.613449 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-node-log" (OuterVolumeSpecName: "node-log") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:54:51.614771 master-0 kubenswrapper[4813]: I1203 19:54:51.613485 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-ovn-node-metrics-cert\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.614771 master-0 kubenswrapper[4813]: I1203 19:54:51.613503 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-log-socket" (OuterVolumeSpecName: "log-socket") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:54:51.614771 master-0 kubenswrapper[4813]: I1203 19:54:51.613509 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-run-netns\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.614771 master-0 kubenswrapper[4813]: I1203 19:54:51.613543 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:54:51.614771 master-0 kubenswrapper[4813]: I1203 19:54:51.613528 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:54:51.614771 master-0 kubenswrapper[4813]: I1203 19:54:51.613564 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-run-ovn-kubernetes\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.614771 master-0 kubenswrapper[4813]: I1203 19:54:51.613585 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:54:51.614771 master-0 kubenswrapper[4813]: I1203 19:54:51.613597 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.614771 master-0 kubenswrapper[4813]: I1203 19:54:51.613621 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:54:51.614771 master-0 kubenswrapper[4813]: I1203 19:54:51.613643 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:54:51.616199 master-0 kubenswrapper[4813]: I1203 19:54:51.613678 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:54:51.616199 master-0 kubenswrapper[4813]: I1203 19:54:51.613699 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-ovnkube-config\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.616199 master-0 kubenswrapper[4813]: I1203 19:54:51.613758 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts8d6\" (UniqueName: \"kubernetes.io/projected/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-kube-api-access-ts8d6\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.616199 master-0 kubenswrapper[4813]: I1203 19:54:51.613803 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-kubelet\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.616199 master-0 kubenswrapper[4813]: I1203 19:54:51.613824 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-cni-bin\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.616199 master-0 kubenswrapper[4813]: I1203 19:54:51.613846 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-slash\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.616199 master-0 kubenswrapper[4813]: I1203 19:54:51.613875 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-var-lib-openvswitch\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.616199 master-0 kubenswrapper[4813]: I1203 19:54:51.613896 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-cni-netd\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") " Dec 03 19:54:51.616199 master-0 kubenswrapper[4813]: I1203 19:54:51.613915 4813 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-run-openvswitch\") pod \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\" (UID: \"fb5c813b-2e17-4004-bf26-a26c26a5ed8f\") 
"
Dec 03 19:54:51.616199 master-0 kubenswrapper[4813]: I1203 19:54:51.613929 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 19:54:51.616199 master-0 kubenswrapper[4813]: I1203 19:54:51.614137 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:54:51.616199 master-0 kubenswrapper[4813]: I1203 19:54:51.614097 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:54:51.616199 master-0 kubenswrapper[4813]: I1203 19:54:51.614072 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:54:51.616199 master-0 kubenswrapper[4813]: I1203 19:54:51.614157 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:54:51.616199 master-0 kubenswrapper[4813]: I1203 19:54:51.614117 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-slash" (OuterVolumeSpecName: "host-slash") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:54:51.616199 master-0 kubenswrapper[4813]: I1203 19:54:51.614072 4813 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-etc-openvswitch\") on node \"master-0\" DevicePath \"\""
Dec 03 19:54:51.616199 master-0 kubenswrapper[4813]: I1203 19:54:51.614213 4813 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-run-netns\") on node \"master-0\" DevicePath \"\""
Dec 03 19:54:51.616199 master-0 kubenswrapper[4813]: I1203 19:54:51.614229 4813 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-run-ovn-kubernetes\") on node \"master-0\" DevicePath \"\""
Dec 03 19:54:51.617718 master-0 kubenswrapper[4813]: I1203 19:54:51.614247 4813 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"master-0\" DevicePath \"\""
Dec 03 19:54:51.617718 master-0 kubenswrapper[4813]: I1203 19:54:51.614261 4813 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-log-socket\") on node \"master-0\" DevicePath \"\""
Dec 03 19:54:51.617718 master-0 kubenswrapper[4813]: I1203 19:54:51.614274 4813 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-node-log\") on node \"master-0\" DevicePath \"\""
Dec 03 19:54:51.617718 master-0 kubenswrapper[4813]: I1203 19:54:51.614287 4813 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-systemd-units\") on node \"master-0\" DevicePath \"\""
Dec 03 19:54:51.617718 master-0 kubenswrapper[4813]: I1203 19:54:51.614123 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:54:51.617718 master-0 kubenswrapper[4813]: I1203 19:54:51.614299 4813 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-run-ovn\") on node \"master-0\" DevicePath \"\""
Dec 03 19:54:51.617718 master-0 kubenswrapper[4813]: I1203 19:54:51.614192 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 19:54:51.617718 master-0 kubenswrapper[4813]: I1203 19:54:51.614442 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 19:54:51.617718 master-0 kubenswrapper[4813]: I1203 19:54:51.616451 4813 scope.go:117] "RemoveContainer" containerID="19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08"
Dec 03 19:54:51.617718 master-0 kubenswrapper[4813]: E1203 19:54:51.616911 4813 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08\": container with ID starting with 19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08 not found: ID does not exist" containerID="19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08"
Dec 03 19:54:51.617718 master-0 kubenswrapper[4813]: I1203 19:54:51.616938 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08"} err="failed to get container status \"19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08\": rpc error: code = NotFound desc = could not find container \"19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08\": container with ID starting with 19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08 not found: ID does not exist"
Dec 03 19:54:51.617718 master-0 kubenswrapper[4813]: I1203 19:54:51.616979 4813 scope.go:117] "RemoveContainer" containerID="7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d"
Dec 03 19:54:51.617718 master-0 kubenswrapper[4813]: E1203 19:54:51.617311 4813 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d\": container with ID starting with 7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d not found: ID does not exist" containerID="7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d"
Dec 03 19:54:51.617718 master-0 kubenswrapper[4813]: I1203 19:54:51.617345 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d"} err="failed to get container status \"7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d\": rpc error: code = NotFound desc = could not find container \"7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d\": container with ID starting with 7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d not found: ID does not exist"
Dec 03 19:54:51.617718 master-0 kubenswrapper[4813]: I1203 19:54:51.617358 4813 scope.go:117] "RemoveContainer" containerID="c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04"
Dec 03 19:54:51.617718 master-0 kubenswrapper[4813]: E1203 19:54:51.617549 4813 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04\": container with ID starting with c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04 not found: ID does not exist" containerID="c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04"
Dec 03 19:54:51.617718 master-0 kubenswrapper[4813]: I1203 19:54:51.617565 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04"} err="failed to get container status \"c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04\": rpc error: code = NotFound desc = could not find container \"c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04\": container with ID starting with c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04 not found: ID does not exist"
Dec 03 19:54:51.617718 master-0 kubenswrapper[4813]: I1203 19:54:51.617577 4813 scope.go:117] "RemoveContainer" containerID="58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b"
Dec 03 19:54:51.619190 master-0 kubenswrapper[4813]: E1203 19:54:51.617743 4813 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b\": container with ID starting with 58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b not found: ID does not exist" containerID="58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b"
Dec 03 19:54:51.619190 master-0 kubenswrapper[4813]: I1203 19:54:51.617759 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b"} err="failed to get container status \"58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b\": rpc error: code = NotFound desc = could not find container \"58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b\": container with ID starting with 58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b not found: ID does not exist"
Dec 03 19:54:51.619190 master-0 kubenswrapper[4813]: I1203 19:54:51.617771 4813 scope.go:117] "RemoveContainer" containerID="e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5"
Dec 03 19:54:51.619190 master-0 kubenswrapper[4813]: E1203 19:54:51.618170 4813 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5\": container with ID starting with e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5 not found: ID does not exist" containerID="e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5"
Dec 03 19:54:51.619190 master-0 kubenswrapper[4813]: I1203 19:54:51.618192 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5"} err="failed to get container status \"e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5\": rpc error: code = NotFound desc = could not find container \"e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5\": container with ID starting with e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5 not found: ID does not exist"
Dec 03 19:54:51.619190 master-0 kubenswrapper[4813]: I1203 19:54:51.618206 4813 scope.go:117] "RemoveContainer" containerID="024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638"
Dec 03 19:54:51.619190 master-0 kubenswrapper[4813]: E1203 19:54:51.618500 4813 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638\": container with ID starting with 024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638 not found: ID does not exist" containerID="024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638"
Dec 03 19:54:51.619190 master-0 kubenswrapper[4813]: I1203 19:54:51.618515 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638"} err="failed to get container status \"024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638\": rpc error: code = NotFound desc = could not find container \"024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638\": container with ID starting with 024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638 not found: ID does not exist"
Dec 03 19:54:51.619190 master-0 kubenswrapper[4813]: I1203 19:54:51.618528 4813 scope.go:117] "RemoveContainer" containerID="28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2"
Dec 03 19:54:51.619190 master-0 kubenswrapper[4813]: E1203 19:54:51.618910 4813 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2\": container with ID starting with 28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2 not found: ID does not exist" containerID="28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2"
Dec 03 19:54:51.619190 master-0 kubenswrapper[4813]: I1203 19:54:51.618927 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2"} err="failed to get container status \"28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2\": rpc error: code = NotFound desc = could not find container \"28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2\": container with ID starting with 28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2 not found: ID does not exist"
Dec 03 19:54:51.619190 master-0 kubenswrapper[4813]: I1203 19:54:51.618939 4813 scope.go:117] "RemoveContainer" containerID="18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020"
Dec 03 19:54:51.619190 master-0 kubenswrapper[4813]: E1203 19:54:51.619177 4813 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020\": container with ID starting with 18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020 not found: ID does not exist" containerID="18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020"
Dec 03 19:54:51.619190 master-0 kubenswrapper[4813]: I1203 19:54:51.619193 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020"} err="failed to get container status \"18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020\": rpc error: code = NotFound desc = could not find container \"18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020\": container with ID starting with 18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020 not found: ID does not exist"
Dec 03 19:54:51.619190 master-0 kubenswrapper[4813]: I1203 19:54:51.619205 4813 scope.go:117] "RemoveContainer" containerID="e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564"
Dec 03 19:54:51.620533 master-0 kubenswrapper[4813]: E1203 19:54:51.619492 4813 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564\": container with ID starting with e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564 not found: ID does not exist" containerID="e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564"
Dec 03 19:54:51.620533 master-0 kubenswrapper[4813]: I1203 19:54:51.619508 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564"} err="failed to get container status \"e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564\": rpc error: code = NotFound desc = could not find container \"e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564\": container with ID starting with e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564 not found: ID does not exist"
Dec 03 19:54:51.620533 master-0 kubenswrapper[4813]: I1203 19:54:51.619520 4813 scope.go:117] "RemoveContainer" containerID="19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08"
Dec 03 19:54:51.620533 master-0 kubenswrapper[4813]: I1203 19:54:51.619763 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08"} err="failed to get container status \"19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08\": rpc error: code = NotFound desc = could not find container \"19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08\": container with ID starting with 19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08 not found: ID does not exist"
Dec 03 19:54:51.620533 master-0 kubenswrapper[4813]: I1203 19:54:51.619788 4813 scope.go:117] "RemoveContainer" containerID="7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d"
Dec 03 19:54:51.620533 master-0 kubenswrapper[4813]: I1203 19:54:51.619966 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d"} err="failed to get container status \"7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d\": rpc error: code = NotFound desc = could not find container \"7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d\": container with ID starting with 7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d not found: ID does not exist"
Dec 03 19:54:51.620533 master-0 kubenswrapper[4813]: I1203 19:54:51.619981 4813 scope.go:117] "RemoveContainer" containerID="c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04"
Dec 03 19:54:51.620533 master-0 kubenswrapper[4813]: I1203 19:54:51.620148 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04"} err="failed to get container status \"c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04\": rpc error: code = NotFound desc = could not find container \"c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04\": container with ID starting with c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04 not found: ID does not exist"
Dec 03 19:54:51.620533 master-0 kubenswrapper[4813]: I1203 19:54:51.620162 4813 scope.go:117] "RemoveContainer" containerID="58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b"
Dec 03 19:54:51.620533 master-0 kubenswrapper[4813]: I1203 19:54:51.620323 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b"} err="failed to get container status \"58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b\": rpc error: code = NotFound desc = could not find container \"58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b\": container with ID starting with 58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b not found: ID does not exist"
Dec 03 19:54:51.620533 master-0 kubenswrapper[4813]: I1203 19:54:51.620336 4813 scope.go:117] "RemoveContainer" containerID="e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5"
Dec 03 19:54:51.620533 master-0 kubenswrapper[4813]: I1203 19:54:51.620520 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:54:51.621363 master-0 kubenswrapper[4813]: I1203 19:54:51.620631 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5"} err="failed to get container status \"e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5\": rpc error: code = NotFound desc = could not find container \"e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5\": container with ID starting with e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5 not found: ID does not exist"
Dec 03 19:54:51.621363 master-0 kubenswrapper[4813]: I1203 19:54:51.620647 4813 scope.go:117] "RemoveContainer" containerID="024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638"
Dec 03 19:54:51.621363 master-0 kubenswrapper[4813]: I1203 19:54:51.620849 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638"} err="failed to get container status \"024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638\": rpc error: code = NotFound desc = could not find container \"024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638\": container with ID starting with 024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638 not found: ID does not exist"
Dec 03 19:54:51.621363 master-0 kubenswrapper[4813]: I1203 19:54:51.620864 4813 scope.go:117] "RemoveContainer" containerID="28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2"
Dec 03 19:54:51.621363 master-0 kubenswrapper[4813]: I1203 19:54:51.621041 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2"} err="failed to get container status \"28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2\": rpc error: code = NotFound desc = could not find container \"28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2\": container with ID starting with 28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2 not found: ID does not exist"
Dec 03 19:54:51.621363 master-0 kubenswrapper[4813]: I1203 19:54:51.621055 4813 scope.go:117] "RemoveContainer" containerID="18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020"
Dec 03 19:54:51.621363 master-0 kubenswrapper[4813]: I1203 19:54:51.621261 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-kube-api-access-ts8d6" (OuterVolumeSpecName: "kube-api-access-ts8d6") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "kube-api-access-ts8d6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:54:51.621363 master-0 kubenswrapper[4813]: I1203 19:54:51.621276 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020"} err="failed to get container status \"18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020\": rpc error: code = NotFound desc = could not find container \"18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020\": container with ID starting with 18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020 not found: ID does not exist"
Dec 03 19:54:51.621363 master-0 kubenswrapper[4813]: I1203 19:54:51.621324 4813 scope.go:117] "RemoveContainer" containerID="e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564"
Dec 03 19:54:51.621956 master-0 kubenswrapper[4813]: I1203 19:54:51.621645 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564"} err="failed to get container status \"e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564\": rpc error: code = NotFound desc = could not find container \"e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564\": container with ID starting with e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564 not found: ID does not exist"
Dec 03 19:54:51.621956 master-0 kubenswrapper[4813]: I1203 19:54:51.621678 4813 scope.go:117] "RemoveContainer" containerID="19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08"
Dec 03 19:54:51.621956 master-0 kubenswrapper[4813]: I1203 19:54:51.621903 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08"} err="failed to get container status \"19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08\": rpc error: code = NotFound desc = could not find container \"19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08\": container with ID starting with 19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08 not found: ID does not exist"
Dec 03 19:54:51.621956 master-0 kubenswrapper[4813]: I1203 19:54:51.621925 4813 scope.go:117] "RemoveContainer" containerID="7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d"
Dec 03 19:54:51.622263 master-0 kubenswrapper[4813]: I1203 19:54:51.622237 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d"} err="failed to get container status \"7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d\": rpc error: code = NotFound desc = could not find container \"7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d\": container with ID starting with 7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d not found: ID does not exist"
Dec 03 19:54:51.622263 master-0 kubenswrapper[4813]: I1203 19:54:51.622251 4813 scope.go:117] "RemoveContainer" containerID="c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04"
Dec 03 19:54:51.622579 master-0 kubenswrapper[4813]: I1203 19:54:51.622533 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04"} err="failed to get container status \"c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04\": rpc error: code = NotFound desc = could not find container \"c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04\": container with ID starting with c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04 not found: ID does not exist"
Dec 03 19:54:51.622579 master-0 kubenswrapper[4813]: I1203 19:54:51.622554 4813 scope.go:117] "RemoveContainer" containerID="58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b"
Dec 03 19:54:51.622967 master-0 kubenswrapper[4813]: I1203 19:54:51.622855 4813 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "fb5c813b-2e17-4004-bf26-a26c26a5ed8f" (UID: "fb5c813b-2e17-4004-bf26-a26c26a5ed8f"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:54:51.623086 master-0 kubenswrapper[4813]: I1203 19:54:51.622991 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b"} err="failed to get container status \"58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b\": rpc error: code = NotFound desc = could not find container \"58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b\": container with ID starting with 58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b not found: ID does not exist"
Dec 03 19:54:51.623086 master-0 kubenswrapper[4813]: I1203 19:54:51.623009 4813 scope.go:117] "RemoveContainer" containerID="e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5"
Dec 03 19:54:51.623353 master-0 kubenswrapper[4813]: I1203 19:54:51.623310 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5"} err="failed to get container status \"e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5\": rpc error: code = NotFound desc = could not find container \"e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5\": container with ID starting with e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5 not found: ID does not exist"
Dec 03 19:54:51.623353 master-0 kubenswrapper[4813]: I1203 19:54:51.623331 4813 scope.go:117] "RemoveContainer" containerID="024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638"
Dec 03 19:54:51.623650 master-0 kubenswrapper[4813]: I1203 19:54:51.623589 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638"} err="failed to get container status \"024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638\": rpc error: code = NotFound desc = could not find container \"024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638\": container with ID starting with 024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638 not found: ID does not exist"
Dec 03 19:54:51.623650 master-0 kubenswrapper[4813]: I1203 19:54:51.623603 4813 scope.go:117] "RemoveContainer" containerID="28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2"
Dec 03 19:54:51.623926 master-0 kubenswrapper[4813]: I1203 19:54:51.623869 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2"} err="failed to get container status \"28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2\": rpc error: code = NotFound desc = could not find container \"28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2\": container with ID starting with 28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2 not found: ID does not exist"
Dec 03 19:54:51.623926 master-0 kubenswrapper[4813]: I1203 19:54:51.623899 4813 scope.go:117] "RemoveContainer" containerID="18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020"
Dec 03 19:54:51.624176 master-0 kubenswrapper[4813]: I1203 19:54:51.624146 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020"} err="failed to get container status \"18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020\": rpc error: code = NotFound desc = could not find container \"18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020\": container with ID starting with 18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020 not found: ID does not exist"
Dec 03 19:54:51.624176 master-0 kubenswrapper[4813]: I1203 19:54:51.624168 4813 scope.go:117] "RemoveContainer" containerID="e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564"
Dec 03 19:54:51.624434 master-0 kubenswrapper[4813]: I1203 19:54:51.624392 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564"} err="failed to get container status \"e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564\": rpc error: code = NotFound desc = could not find container \"e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564\": container with ID starting with e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564 not found: ID does not exist"
Dec 03 19:54:51.624434 master-0 kubenswrapper[4813]: I1203 19:54:51.624423 4813 scope.go:117] "RemoveContainer" containerID="19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08"
Dec 03 19:54:51.624936 master-0 kubenswrapper[4813]: I1203 19:54:51.624891 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08"} err="failed to get container status \"19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08\": rpc error: code = NotFound desc = could not find container \"19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08\": container with ID starting with 19e747f0b13749f98763ea049b945f326886f92d31f6a5cd014d9437b5f80a08 not found: ID does not exist"
Dec 03 19:54:51.624936 master-0 kubenswrapper[4813]: I1203 19:54:51.624919 4813 scope.go:117] "RemoveContainer" containerID="7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d"
Dec 03 19:54:51.625190 master-0 kubenswrapper[4813]: I1203 19:54:51.625145 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d"} err="failed to get container status \"7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d\": rpc error: code = NotFound desc = could not find container \"7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d\": container with ID starting with 7fa6586b9e825bdd8889af21be4e155287d6f576a52348aee2f21aa88e5d8f8d not found: ID does not exist"
Dec 03 19:54:51.625190 master-0 kubenswrapper[4813]: I1203 19:54:51.625174 4813 scope.go:117] "RemoveContainer" containerID="c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04"
Dec 03 19:54:51.625535 master-0 kubenswrapper[4813]: I1203 19:54:51.625494 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04"} err="failed to get container status \"c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04\": rpc error: code = NotFound desc = could not find container \"c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04\": container with ID starting with c9b705507c14aa2fa1c5f56ae9995d74a4daa9c720d9bc4ceb5d0aafeaaa5b04 not found: ID does not exist"
Dec 03 19:54:51.625535 master-0 kubenswrapper[4813]: I1203 19:54:51.625519 4813 scope.go:117] "RemoveContainer" containerID="58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b"
Dec 03 19:54:51.625771 master-0 kubenswrapper[4813]: I1203 19:54:51.625730 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b"} err="failed to get container status \"58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b\": rpc error: code = NotFound desc = could not find container \"58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b\": container with ID starting with 58d2f1a195f7abf31cfadd0ec4843b0a2b7fe8ee380c2cd2d7fbd1e62cf2ff3b not found: ID does not exist"
Dec 03 19:54:51.625771 master-0 kubenswrapper[4813]: I1203 19:54:51.625755 4813 scope.go:117] "RemoveContainer" containerID="e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5"
Dec 03 19:54:51.626149 master-0 kubenswrapper[4813]: I1203 19:54:51.626109 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5"} err="failed to get container status \"e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5\": rpc error: code = NotFound desc = could not find container \"e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5\": container with ID starting with e8ed03ff5bfb1d83997410040ca868c2faf007f3b24ac2df182c1d5b1177e6b5 not found: ID does not exist"
Dec 03 19:54:51.626149 master-0 kubenswrapper[4813]: I1203 19:54:51.626128 4813 scope.go:117] "RemoveContainer" containerID="024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638"
Dec 03 19:54:51.626342 master-0 kubenswrapper[4813]: I1203 19:54:51.626315 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638"} err="failed to get container status \"024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638\": rpc error: code = NotFound desc = could not find container \"024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638\": container with ID starting with 024296f166666f4c6de7a726beab4d00f510beb1bffc2226b6648d7b744c0638 not found: ID does not exist"
Dec 03 19:54:51.626342 master-0 kubenswrapper[4813]: I1203 19:54:51.626335 4813 scope.go:117] "RemoveContainer" containerID="28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2"
Dec 03 19:54:51.626673 master-0 kubenswrapper[4813]: I1203 19:54:51.626634 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2"} err="failed to get container status \"28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2\": rpc error: code = NotFound desc = could not find container \"28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2\": container with ID starting with 28fef41a28533a9d2d0cd10a927deeccf5d02ccf7ecc8afe84b2f200c20af2b2 not found: ID does not exist"
Dec 03 19:54:51.626673 master-0 kubenswrapper[4813]: I1203 19:54:51.626653 4813 scope.go:117] "RemoveContainer" containerID="18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020"
Dec 03 19:54:51.626948 master-0 kubenswrapper[4813]: I1203 19:54:51.626923 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020"} err="failed to get container status \"18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020\": rpc error: code = NotFound desc = could not find container \"18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020\": container with ID starting with 18d15da4d5a1a0d840119fcb9b2d66d9cb52343ad7bed2795a1f1f8995509020 not found: ID does not exist"
Dec 03 19:54:51.626948 master-0 kubenswrapper[4813]: I1203 19:54:51.626940 4813 scope.go:117] "RemoveContainer" containerID="e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564"
Dec 03 19:54:51.627189 master-0 kubenswrapper[4813]: I1203 19:54:51.627161 4813 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564"} err="failed to get container status \"e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564\": rpc error: code = NotFound desc = could not find container \"e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564\": container with ID starting with e58f624cb9204be14e44db82a27d3998105a1c91bfe11990bcca00a6015a3564 not found: ID does not exist"
Dec 03 19:54:51.715410 master-0 kubenswrapper[4813]: I1203 19:54:51.715312 4813 
reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-run-systemd\") on node \"master-0\" DevicePath \"\"" Dec 03 19:54:51.715410 master-0 kubenswrapper[4813]: I1203 19:54:51.715358 4813 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-env-overrides\") on node \"master-0\" DevicePath \"\"" Dec 03 19:54:51.715410 master-0 kubenswrapper[4813]: I1203 19:54:51.715376 4813 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-ovn-node-metrics-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 19:54:51.715410 master-0 kubenswrapper[4813]: I1203 19:54:51.715396 4813 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-ovnkube-config\") on node \"master-0\" DevicePath \"\"" Dec 03 19:54:51.715410 master-0 kubenswrapper[4813]: I1203 19:54:51.715413 4813 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts8d6\" (UniqueName: \"kubernetes.io/projected/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-kube-api-access-ts8d6\") on node \"master-0\" DevicePath \"\"" Dec 03 19:54:51.715410 master-0 kubenswrapper[4813]: I1203 19:54:51.715430 4813 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-kubelet\") on node \"master-0\" DevicePath \"\"" Dec 03 19:54:51.715915 master-0 kubenswrapper[4813]: I1203 19:54:51.715447 4813 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-cni-bin\") on node \"master-0\" DevicePath \"\"" Dec 03 19:54:51.715915 master-0 kubenswrapper[4813]: I1203 19:54:51.715464 
4813 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-cni-netd\") on node \"master-0\" DevicePath \"\"" Dec 03 19:54:51.715915 master-0 kubenswrapper[4813]: I1203 19:54:51.715480 4813 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-run-openvswitch\") on node \"master-0\" DevicePath \"\"" Dec 03 19:54:51.715915 master-0 kubenswrapper[4813]: I1203 19:54:51.715496 4813 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-host-slash\") on node \"master-0\" DevicePath \"\"" Dec 03 19:54:51.715915 master-0 kubenswrapper[4813]: I1203 19:54:51.715513 4813 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-var-lib-openvswitch\") on node \"master-0\" DevicePath \"\"" Dec 03 19:54:51.715915 master-0 kubenswrapper[4813]: I1203 19:54:51.715529 4813 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb5c813b-2e17-4004-bf26-a26c26a5ed8f-ovnkube-script-lib\") on node \"master-0\" DevicePath \"\"" Dec 03 19:54:52.023514 master-0 kubenswrapper[4813]: I1203 19:54:52.023420 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:52.023699 master-0 kubenswrapper[4813]: I1203 19:54:52.023453 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:52.023699 master-0 kubenswrapper[4813]: E1203 19:54:52.023661 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:52.023889 master-0 kubenswrapper[4813]: E1203 19:54:52.023771 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:52.724116 master-0 kubenswrapper[4813]: I1203 19:54:52.723869 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:54:52.724116 master-0 kubenswrapper[4813]: E1203 19:54:52.724107 4813 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 19:54:52.724116 master-0 kubenswrapper[4813]: E1203 19:54:52.724186 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert podName:0c45d22f-1492-47d7-83b6-6dd356a8454d nodeName:}" failed. 
No retries permitted until 2025-12-03 19:55:56.724166866 +0000 UTC m=+201.152965325 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert") pod "cluster-version-operator-869c786959-zbl42" (UID: "0c45d22f-1492-47d7-83b6-6dd356a8454d") : secret "cluster-version-operator-serving-cert" not found Dec 03 19:54:54.022985 master-0 kubenswrapper[4813]: I1203 19:54:54.022893 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:54.022985 master-0 kubenswrapper[4813]: I1203 19:54:54.022973 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:54.023640 master-0 kubenswrapper[4813]: E1203 19:54:54.023141 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:54.023640 master-0 kubenswrapper[4813]: E1203 19:54:54.023270 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:54.307504 master-0 kubenswrapper[4813]: I1203 19:54:54.307161 4813 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xghq7"] Dec 03 19:54:55.546148 master-0 kubenswrapper[4813]: I1203 19:54:55.546082 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkhcw\" (UniqueName: \"kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw\") pod \"network-check-target-x6vwd\" (UID: \"830d89af-1266-43ac-b113-990a28595f91\") " pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:55.546836 master-0 kubenswrapper[4813]: E1203 19:54:55.546335 4813 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 19:54:55.546969 master-0 kubenswrapper[4813]: E1203 19:54:55.546954 4813 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 19:54:55.547040 master-0 kubenswrapper[4813]: E1203 19:54:55.547029 4813 projected.go:194] Error preparing data for projected volume kube-api-access-lkhcw for pod openshift-network-diagnostics/network-check-target-x6vwd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 19:54:55.547181 master-0 kubenswrapper[4813]: E1203 19:54:55.547157 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw podName:830d89af-1266-43ac-b113-990a28595f91 nodeName:}" failed. 
No retries permitted until 2025-12-03 19:55:27.547131159 +0000 UTC m=+171.975929618 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-lkhcw" (UniqueName: "kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw") pod "network-check-target-x6vwd" (UID: "830d89af-1266-43ac-b113-990a28595f91") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 19:54:55.985006 master-0 kubenswrapper[4813]: E1203 19:54:55.984908 4813 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 19:54:56.008081 master-0 kubenswrapper[4813]: I1203 19:54:56.008004 4813 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xghq7"] Dec 03 19:54:56.009482 master-0 kubenswrapper[4813]: I1203 19:54:56.009440 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l9m2r"] Dec 03 19:54:56.009830 master-0 kubenswrapper[4813]: E1203 19:54:56.009760 4813 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="sbdb" Dec 03 19:54:56.009979 master-0 kubenswrapper[4813]: I1203 19:54:56.009957 4813 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="sbdb" Dec 03 19:54:56.010094 master-0 kubenswrapper[4813]: E1203 19:54:56.010074 4813 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="kube-rbac-proxy-node" Dec 03 19:54:56.010212 master-0 kubenswrapper[4813]: I1203 19:54:56.010187 4813 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" 
containerName="kube-rbac-proxy-node" Dec 03 19:54:56.010383 master-0 kubenswrapper[4813]: E1203 19:54:56.010355 4813 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="ovn-controller" Dec 03 19:54:56.010548 master-0 kubenswrapper[4813]: I1203 19:54:56.010519 4813 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="ovn-controller" Dec 03 19:54:56.010729 master-0 kubenswrapper[4813]: E1203 19:54:56.010700 4813 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="ovn-acl-logging" Dec 03 19:54:56.010908 master-0 kubenswrapper[4813]: I1203 19:54:56.010886 4813 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="ovn-acl-logging" Dec 03 19:54:56.011020 master-0 kubenswrapper[4813]: E1203 19:54:56.011000 4813 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 19:54:56.011133 master-0 kubenswrapper[4813]: I1203 19:54:56.011113 4813 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 19:54:56.011235 master-0 kubenswrapper[4813]: E1203 19:54:56.011217 4813 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="nbdb" Dec 03 19:54:56.011342 master-0 kubenswrapper[4813]: I1203 19:54:56.011324 4813 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="nbdb" Dec 03 19:54:56.011458 master-0 kubenswrapper[4813]: E1203 19:54:56.011438 4813 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="ovnkube-controller" Dec 03 19:54:56.011566 master-0 kubenswrapper[4813]: I1203 
19:54:56.011548 4813 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="ovnkube-controller" Dec 03 19:54:56.011675 master-0 kubenswrapper[4813]: E1203 19:54:56.011655 4813 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="kubecfg-setup" Dec 03 19:54:56.011803 master-0 kubenswrapper[4813]: I1203 19:54:56.011758 4813 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="kubecfg-setup" Dec 03 19:54:56.011930 master-0 kubenswrapper[4813]: E1203 19:54:56.011910 4813 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="northd" Dec 03 19:54:56.012040 master-0 kubenswrapper[4813]: I1203 19:54:56.012021 4813 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="northd" Dec 03 19:54:56.012275 master-0 kubenswrapper[4813]: I1203 19:54:56.012251 4813 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="nbdb" Dec 03 19:54:56.012415 master-0 kubenswrapper[4813]: I1203 19:54:56.012395 4813 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="ovn-acl-logging" Dec 03 19:54:56.012527 master-0 kubenswrapper[4813]: I1203 19:54:56.012508 4813 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 19:54:56.012628 master-0 kubenswrapper[4813]: I1203 19:54:56.012610 4813 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="sbdb" Dec 03 19:54:56.012748 master-0 kubenswrapper[4813]: I1203 19:54:56.012724 4813 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="kube-rbac-proxy-node" Dec 03 19:54:56.012939 master-0 kubenswrapper[4813]: I1203 19:54:56.012912 4813 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="northd" Dec 03 19:54:56.013089 master-0 kubenswrapper[4813]: I1203 19:54:56.013067 4813 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="ovn-controller" Dec 03 19:54:56.013240 master-0 kubenswrapper[4813]: I1203 19:54:56.013217 4813 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" containerName="ovnkube-controller" Dec 03 19:54:56.014419 master-0 kubenswrapper[4813]: I1203 19:54:56.014385 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.017146 master-0 kubenswrapper[4813]: I1203 19:54:56.017100 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 19:54:56.017579 master-0 kubenswrapper[4813]: I1203 19:54:56.017516 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 19:54:56.023065 master-0 kubenswrapper[4813]: I1203 19:54:56.022944 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:56.023297 master-0 kubenswrapper[4813]: E1203 19:54:56.023238 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:56.023427 master-0 kubenswrapper[4813]: I1203 19:54:56.023013 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:56.023983 master-0 kubenswrapper[4813]: E1203 19:54:56.023866 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:56.033093 master-0 kubenswrapper[4813]: I1203 19:54:56.033021 4813 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb5c813b-2e17-4004-bf26-a26c26a5ed8f" path="/var/lib/kubelet/pods/fb5c813b-2e17-4004-bf26-a26c26a5ed8f/volumes" Dec 03 19:54:56.051894 master-0 kubenswrapper[4813]: I1203 19:54:56.051803 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-slash\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.051894 master-0 kubenswrapper[4813]: I1203 19:54:56.051886 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-var-lib-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.052167 master-0 kubenswrapper[4813]: I1203 19:54:56.051937 4813 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-cni-netd\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.052167 master-0 kubenswrapper[4813]: I1203 19:54:56.051986 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-run-netns\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.052167 master-0 kubenswrapper[4813]: I1203 19:54:56.052031 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-node-log\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.052167 master-0 kubenswrapper[4813]: I1203 19:54:56.052079 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-log-socket\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.052167 master-0 kubenswrapper[4813]: I1203 19:54:56.052143 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-env-overrides\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.052392 master-0 kubenswrapper[4813]: 
I1203 19:54:56.052191 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-cni-bin\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.052392 master-0 kubenswrapper[4813]: I1203 19:54:56.052233 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-ovn\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.052392 master-0 kubenswrapper[4813]: I1203 19:54:56.052276 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-ovnkube-script-lib\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.052392 master-0 kubenswrapper[4813]: I1203 19:54:56.052344 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-systemd-units\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.052392 master-0 kubenswrapper[4813]: I1203 19:54:56.052389 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-run-ovn-kubernetes\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.052616 master-0 kubenswrapper[4813]: I1203 19:54:56.052512 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.052616 master-0 kubenswrapper[4813]: I1203 19:54:56.052602 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-kubelet\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.052732 master-0 kubenswrapper[4813]: I1203 19:54:56.052675 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f618ea7-3ad7-4dce-b450-a8202285f312-ovn-node-metrics-cert\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.052816 master-0 kubenswrapper[4813]: I1203 19:54:56.052743 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.052915 master-0 kubenswrapper[4813]: I1203 19:54:56.052878 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-systemd\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.052969 master-0 kubenswrapper[4813]: I1203 19:54:56.052922 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-ovnkube-config\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.052969 master-0 kubenswrapper[4813]: I1203 19:54:56.052956 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-etc-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.053041 master-0 kubenswrapper[4813]: I1203 19:54:56.053006 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c9qq\" (UniqueName: \"kubernetes.io/projected/2f618ea7-3ad7-4dce-b450-a8202285f312-kube-api-access-4c9qq\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.153865 master-0 kubenswrapper[4813]: I1203 19:54:56.153729 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-systemd-units\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.153865 master-0 kubenswrapper[4813]: I1203 19:54:56.153867 4813 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-run-ovn-kubernetes\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.153865 master-0 kubenswrapper[4813]: I1203 19:54:56.153872 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-systemd-units\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154149 master-0 kubenswrapper[4813]: I1203 19:54:56.153931 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154149 master-0 kubenswrapper[4813]: I1203 19:54:56.153964 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-kubelet\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154149 master-0 kubenswrapper[4813]: I1203 19:54:56.153988 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-run-ovn-kubernetes\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154149 master-0 kubenswrapper[4813]: I1203 19:54:56.154016 4813 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f618ea7-3ad7-4dce-b450-a8202285f312-ovn-node-metrics-cert\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154149 master-0 kubenswrapper[4813]: I1203 19:54:56.154051 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154149 master-0 kubenswrapper[4813]: I1203 19:54:56.154066 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-kubelet\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154149 master-0 kubenswrapper[4813]: I1203 19:54:56.154086 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-systemd\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154149 master-0 kubenswrapper[4813]: I1203 19:54:56.154117 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154149 master-0 kubenswrapper[4813]: I1203 
19:54:56.154119 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-ovnkube-config\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154149 master-0 kubenswrapper[4813]: I1203 19:54:56.154157 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c9qq\" (UniqueName: \"kubernetes.io/projected/2f618ea7-3ad7-4dce-b450-a8202285f312-kube-api-access-4c9qq\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154593 master-0 kubenswrapper[4813]: I1203 19:54:56.154181 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-etc-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154593 master-0 kubenswrapper[4813]: I1203 19:54:56.154197 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-systemd\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154593 master-0 kubenswrapper[4813]: I1203 19:54:56.154072 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154593 master-0 kubenswrapper[4813]: I1203 19:54:56.154235 4813 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-slash\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154593 master-0 kubenswrapper[4813]: I1203 19:54:56.154212 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-slash\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154593 master-0 kubenswrapper[4813]: I1203 19:54:56.154268 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-var-lib-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154593 master-0 kubenswrapper[4813]: I1203 19:54:56.154288 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-cni-netd\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154593 master-0 kubenswrapper[4813]: I1203 19:54:56.154286 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-etc-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154593 master-0 kubenswrapper[4813]: I1203 19:54:56.154307 4813 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-run-netns\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154593 master-0 kubenswrapper[4813]: I1203 19:54:56.154346 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-run-netns\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154593 master-0 kubenswrapper[4813]: I1203 19:54:56.154358 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-node-log\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154593 master-0 kubenswrapper[4813]: I1203 19:54:56.154401 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-node-log\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154593 master-0 kubenswrapper[4813]: I1203 19:54:56.154392 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-var-lib-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154593 master-0 kubenswrapper[4813]: I1203 19:54:56.154420 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-cni-netd\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154593 master-0 kubenswrapper[4813]: I1203 19:54:56.154411 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-log-socket\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154593 master-0 kubenswrapper[4813]: I1203 19:54:56.154452 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-log-socket\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154593 master-0 kubenswrapper[4813]: I1203 19:54:56.154506 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-env-overrides\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.154593 master-0 kubenswrapper[4813]: I1203 19:54:56.154555 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-cni-bin\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.155108 master-0 kubenswrapper[4813]: I1203 19:54:56.154611 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-ovn\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.155108 master-0 kubenswrapper[4813]: I1203 19:54:56.154669 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-cni-bin\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.155108 master-0 kubenswrapper[4813]: I1203 19:54:56.154664 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-ovnkube-script-lib\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.155108 master-0 kubenswrapper[4813]: I1203 19:54:56.154756 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-ovn\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.155108 master-0 kubenswrapper[4813]: I1203 19:54:56.155051 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-env-overrides\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.155890 master-0 kubenswrapper[4813]: I1203 19:54:56.155828 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-ovnkube-config\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.156182 master-0 kubenswrapper[4813]: I1203 19:54:56.156116 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-ovnkube-script-lib\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:56.157726 master-0 kubenswrapper[4813]: I1203 19:54:56.157669 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f618ea7-3ad7-4dce-b450-a8202285f312-ovn-node-metrics-cert\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:58.023467 master-0 kubenswrapper[4813]: I1203 19:54:58.023355 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:54:58.024429 master-0 kubenswrapper[4813]: I1203 19:54:58.023351 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:54:58.024429 master-0 kubenswrapper[4813]: E1203 19:54:58.023598 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:54:58.024429 master-0 kubenswrapper[4813]: E1203 19:54:58.023726 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:54:59.471273 master-0 kubenswrapper[4813]: I1203 19:54:59.471207 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c9qq\" (UniqueName: \"kubernetes.io/projected/2f618ea7-3ad7-4dce-b450-a8202285f312-kube-api-access-4c9qq\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:59.637566 master-0 kubenswrapper[4813]: I1203 19:54:59.637481 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:54:59.654591 master-0 kubenswrapper[4813]: W1203 19:54:59.654482 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f618ea7_3ad7_4dce_b450_a8202285f312.slice/crio-1677ae8793f1b3e61b335ded5b7ac95e63d604742bdba149b92ecb06281d760f WatchSource:0}: Error finding container 1677ae8793f1b3e61b335ded5b7ac95e63d604742bdba149b92ecb06281d760f: Status 404 returned error can't find the container with id 1677ae8793f1b3e61b335ded5b7ac95e63d604742bdba149b92ecb06281d760f Dec 03 19:55:00.023274 master-0 kubenswrapper[4813]: I1203 19:55:00.023087 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:55:00.023274 master-0 kubenswrapper[4813]: I1203 19:55:00.023161 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:55:00.023542 master-0 kubenswrapper[4813]: E1203 19:55:00.023337 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:55:00.023542 master-0 kubenswrapper[4813]: E1203 19:55:00.023463 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:55:00.546182 master-0 kubenswrapper[4813]: I1203 19:55:00.545987 4813 generic.go:334] "Generic (PLEG): container finished" podID="2f618ea7-3ad7-4dce-b450-a8202285f312" containerID="dddd03afbbaf28bd7aa58c27ce415ad910bb5c941f19a9c53d3832794bc71ce3" exitCode=0 Dec 03 19:55:00.546182 master-0 kubenswrapper[4813]: I1203 19:55:00.546047 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" event={"ID":"2f618ea7-3ad7-4dce-b450-a8202285f312","Type":"ContainerDied","Data":"dddd03afbbaf28bd7aa58c27ce415ad910bb5c941f19a9c53d3832794bc71ce3"} Dec 03 19:55:00.546182 master-0 kubenswrapper[4813]: I1203 19:55:00.546089 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" event={"ID":"2f618ea7-3ad7-4dce-b450-a8202285f312","Type":"ContainerStarted","Data":"1677ae8793f1b3e61b335ded5b7ac95e63d604742bdba149b92ecb06281d760f"} Dec 03 19:55:00.987526 master-0 kubenswrapper[4813]: E1203 19:55:00.987007 4813 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 03 19:55:01.555666 master-0 kubenswrapper[4813]: I1203 19:55:01.555503 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" event={"ID":"2f618ea7-3ad7-4dce-b450-a8202285f312","Type":"ContainerStarted","Data":"09e467a2ad1e36f2064e816f8632f96e94fc7b92b5f4673c303ff30c789719e1"} Dec 03 19:55:01.555666 master-0 kubenswrapper[4813]: I1203 19:55:01.555562 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" event={"ID":"2f618ea7-3ad7-4dce-b450-a8202285f312","Type":"ContainerStarted","Data":"48aea2988eaecd2cc4b249f186694cb8b142216c1ca0ed76e91c3abed95360b0"} Dec 03 19:55:01.555666 master-0 kubenswrapper[4813]: I1203 19:55:01.555581 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" event={"ID":"2f618ea7-3ad7-4dce-b450-a8202285f312","Type":"ContainerStarted","Data":"d370deaa61f11441b1fb1ce39d9ba35a2f1c6f246115be3664c5ae7d9b3582c5"} Dec 03 19:55:01.555666 master-0 kubenswrapper[4813]: I1203 19:55:01.555598 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" event={"ID":"2f618ea7-3ad7-4dce-b450-a8202285f312","Type":"ContainerStarted","Data":"1882c342224067097b84048dde7df11fc797be05b0f36f1854c35956b16b0440"} Dec 03 19:55:01.555666 master-0 kubenswrapper[4813]: I1203 19:55:01.555614 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" event={"ID":"2f618ea7-3ad7-4dce-b450-a8202285f312","Type":"ContainerStarted","Data":"9743f1e9bf43f9e46260ca3e5e447cb502408f5f9ca15ad0d814b1d8f350d7a3"} Dec 03 19:55:01.555666 master-0 kubenswrapper[4813]: I1203 19:55:01.555630 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" event={"ID":"2f618ea7-3ad7-4dce-b450-a8202285f312","Type":"ContainerStarted","Data":"241ba2854d52b1be075ddb34721423b9fe0a9ac41e0d57078b008ce5286a7a76"} Dec 03 
19:55:02.022936 master-0 kubenswrapper[4813]: I1203 19:55:02.022889 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:55:02.023400 master-0 kubenswrapper[4813]: I1203 19:55:02.023014 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:55:02.023981 master-0 kubenswrapper[4813]: E1203 19:55:02.023532 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:55:02.023981 master-0 kubenswrapper[4813]: E1203 19:55:02.023337 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:55:03.571627 master-0 kubenswrapper[4813]: I1203 19:55:03.571534 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" event={"ID":"2f618ea7-3ad7-4dce-b450-a8202285f312","Type":"ContainerStarted","Data":"291ead623ee33655a93915a1f1ddf24a3f86e798795b13de7a06dcaea8ae8e46"} Dec 03 19:55:04.023283 master-0 kubenswrapper[4813]: I1203 19:55:04.023221 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:55:04.023840 master-0 kubenswrapper[4813]: I1203 19:55:04.023436 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:55:04.024071 master-0 kubenswrapper[4813]: E1203 19:55:04.024019 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:55:04.024377 master-0 kubenswrapper[4813]: E1203 19:55:04.024132 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:55:05.582303 master-0 kubenswrapper[4813]: I1203 19:55:05.582234 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p9sdj_a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6/kube-multus/0.log" Dec 03 19:55:05.583060 master-0 kubenswrapper[4813]: I1203 19:55:05.582330 4813 generic.go:334] "Generic (PLEG): container finished" podID="a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6" containerID="8a74abebd0e92eb267bf92fa216f251466a061d49782c0f5612aabcb75ab61c6" exitCode=1 Dec 03 19:55:05.583060 master-0 kubenswrapper[4813]: I1203 19:55:05.582385 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p9sdj" event={"ID":"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6","Type":"ContainerDied","Data":"8a74abebd0e92eb267bf92fa216f251466a061d49782c0f5612aabcb75ab61c6"} Dec 03 19:55:05.583060 master-0 kubenswrapper[4813]: I1203 19:55:05.583053 4813 scope.go:117] "RemoveContainer" containerID="8a74abebd0e92eb267bf92fa216f251466a061d49782c0f5612aabcb75ab61c6" Dec 03 19:55:05.988790 master-0 kubenswrapper[4813]: E1203 19:55:05.988632 4813 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 19:55:06.023312 master-0 kubenswrapper[4813]: I1203 19:55:06.023225 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:55:06.023312 master-0 kubenswrapper[4813]: I1203 19:55:06.023275 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:55:06.025033 master-0 kubenswrapper[4813]: E1203 19:55:06.024840 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:55:06.025033 master-0 kubenswrapper[4813]: E1203 19:55:06.024976 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:55:06.589983 master-0 kubenswrapper[4813]: I1203 19:55:06.589633 4813 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p9sdj_a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6/kube-multus/0.log" Dec 03 19:55:06.589983 master-0 kubenswrapper[4813]: I1203 19:55:06.589720 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p9sdj" event={"ID":"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6","Type":"ContainerStarted","Data":"5eb20eae0e17a7ab046f79c2ce38a11a4d5ca2ac559b0490c1a4458343dbddf2"} Dec 03 19:55:06.596753 master-0 kubenswrapper[4813]: I1203 19:55:06.596679 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" event={"ID":"2f618ea7-3ad7-4dce-b450-a8202285f312","Type":"ContainerStarted","Data":"84cb0c7e30f49d794f36b536723e448ce0f61789fa9575075cfed4853030ed51"} Dec 03 19:55:06.597132 master-0 kubenswrapper[4813]: I1203 19:55:06.597076 4813 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:06.597132 master-0 kubenswrapper[4813]: I1203 19:55:06.597111 4813 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:06.597132 master-0 kubenswrapper[4813]: I1203 19:55:06.597126 4813 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:06.630117 master-0 kubenswrapper[4813]: I1203 19:55:06.630063 4813 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:06.630629 master-0 kubenswrapper[4813]: I1203 19:55:06.630587 4813 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:06.644969 master-0 kubenswrapper[4813]: I1203 19:55:06.644751 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" podStartSLOduration=13.644728109 podStartE2EDuration="13.644728109s" podCreationTimestamp="2025-12-03 19:54:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:55:06.64393226 +0000 UTC m=+151.072730719" watchObservedRunningTime="2025-12-03 19:55:06.644728109 +0000 UTC m=+151.073526558" Dec 03 19:55:08.023255 master-0 kubenswrapper[4813]: I1203 19:55:08.022933 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:55:08.023747 master-0 kubenswrapper[4813]: I1203 19:55:08.022948 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:55:08.023747 master-0 kubenswrapper[4813]: E1203 19:55:08.023363 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:55:08.023747 master-0 kubenswrapper[4813]: E1203 19:55:08.023503 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91" Dec 03 19:55:10.023926 master-0 kubenswrapper[4813]: I1203 19:55:10.023460 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:55:10.023926 master-0 kubenswrapper[4813]: E1203 19:55:10.023635 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hs6gf" podUID="46b5d4d0-b841-4e87-84b4-85911ff04325" Dec 03 19:55:10.025290 master-0 kubenswrapper[4813]: I1203 19:55:10.023975 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd"
Dec 03 19:55:10.025290 master-0 kubenswrapper[4813]: E1203 19:55:10.024037 4813 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6vwd" podUID="830d89af-1266-43ac-b113-990a28595f91"
Dec 03 19:55:12.023040 master-0 kubenswrapper[4813]: I1203 19:55:12.022930 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:55:12.023040 master-0 kubenswrapper[4813]: I1203 19:55:12.022977 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd"
Dec 03 19:55:12.026003 master-0 kubenswrapper[4813]: I1203 19:55:12.025920 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 03 19:55:12.026003 master-0 kubenswrapper[4813]: I1203 19:55:12.025972 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 03 19:55:12.026282 master-0 kubenswrapper[4813]: I1203 19:55:12.026177 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 03 19:55:12.903772 master-0 kubenswrapper[4813]: I1203 19:55:12.903671 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:55:12.904207 master-0 kubenswrapper[4813]: E1203 19:55:12.903895 4813 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Dec 03 19:55:12.904207 master-0 kubenswrapper[4813]: E1203 19:55:12.903997 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs podName:46b5d4d0-b841-4e87-84b4-85911ff04325 nodeName:}" failed. No retries permitted until 2025-12-03 19:56:16.903969554 +0000 UTC m=+221.332768043 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs") pod "network-metrics-daemon-hs6gf" (UID: "46b5d4d0-b841-4e87-84b4-85911ff04325") : secret "metrics-daemon-secret" not found
Dec 03 19:55:19.504964 master-0 kubenswrapper[4813]: I1203 19:55:19.504878 4813 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady"
Dec 03 19:55:19.542630 master-0 kubenswrapper[4813]: I1203 19:55:19.542560 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q"]
Dec 03 19:55:19.543237 master-0 kubenswrapper[4813]: I1203 19:55:19.543189 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q"
Dec 03 19:55:19.547442 master-0 kubenswrapper[4813]: I1203 19:55:19.546854 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 03 19:55:19.547442 master-0 kubenswrapper[4813]: I1203 19:55:19.547065 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Dec 03 19:55:19.547442 master-0 kubenswrapper[4813]: I1203 19:55:19.547123 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Dec 03 19:55:19.550689 master-0 kubenswrapper[4813]: I1203 19:55:19.549671 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5"]
Dec 03 19:55:19.550689 master-0 kubenswrapper[4813]: I1203 19:55:19.550268 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5"
Dec 03 19:55:19.552234 master-0 kubenswrapper[4813]: I1203 19:55:19.551366 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"]
Dec 03 19:55:19.552234 master-0 kubenswrapper[4813]: I1203 19:55:19.552010 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"
Dec 03 19:55:19.552624 master-0 kubenswrapper[4813]: I1203 19:55:19.552588 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Dec 03 19:55:19.553489 master-0 kubenswrapper[4813]: I1203 19:55:19.553429 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5"]
Dec 03 19:55:19.558233 master-0 kubenswrapper[4813]: I1203 19:55:19.558071 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 03 19:55:19.558233 master-0 kubenswrapper[4813]: I1203 19:55:19.558087 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Dec 03 19:55:19.558506 master-0 kubenswrapper[4813]: W1203 19:55:19.558342 4813 reflector.go:561] object-"openshift-monitoring"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-monitoring": no relationship found between node 'master-0' and this object
Dec 03 19:55:19.558506 master-0 kubenswrapper[4813]: W1203 19:55:19.558461 4813 reflector.go:561] object-"openshift-monitoring"/"cluster-monitoring-operator-tls": failed to list *v1.Secret: secrets "cluster-monitoring-operator-tls" is forbidden: User "system:node:master-0" cannot list resource "secrets" in API group "" in the namespace "openshift-monitoring": no relationship found between node 'master-0' and this object
Dec 03 19:55:19.558671 master-0 kubenswrapper[4813]: E1203 19:55:19.558546 4813 reflector.go:158] "Unhandled Error" err="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cluster-monitoring-operator-tls\" is forbidden: User \"system:node:master-0\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'master-0' and this object" logger="UnhandledError"
Dec 03 19:55:19.558773 master-0 kubenswrapper[4813]: E1203 19:55:19.558505 4813 reflector.go:158] "Unhandled Error" err="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'master-0' and this object" logger="UnhandledError"
Dec 03 19:55:19.560591 master-0 kubenswrapper[4813]: W1203 19:55:19.558911 4813 reflector.go:561] object-"openshift-monitoring"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-monitoring": no relationship found between node 'master-0' and this object
Dec 03 19:55:19.560591 master-0 kubenswrapper[4813]: I1203 19:55:19.559065 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 03 19:55:19.560591 master-0 kubenswrapper[4813]: E1203 19:55:19.559357 4813 reflector.go:158] "Unhandled Error" err="object-\"openshift-monitoring\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'master-0' and this object" logger="UnhandledError"
Dec 03 19:55:19.560919 master-0 kubenswrapper[4813]: I1203 19:55:19.560716 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5"
Dec 03 19:55:19.562750 master-0 kubenswrapper[4813]: W1203 19:55:19.561611 4813 reflector.go:561] object-"openshift-monitoring"/"telemetry-config": failed to list *v1.ConfigMap: configmaps "telemetry-config" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-monitoring": no relationship found between node 'master-0' and this object
Dec 03 19:55:19.562750 master-0 kubenswrapper[4813]: E1203 19:55:19.561655 4813 reflector.go:158] "Unhandled Error" err="object-\"openshift-monitoring\"/\"telemetry-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"telemetry-config\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'master-0' and this object" logger="UnhandledError"
Dec 03 19:55:19.562750 master-0 kubenswrapper[4813]: I1203 19:55:19.561884 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"]
Dec 03 19:55:19.562750 master-0 kubenswrapper[4813]: I1203 19:55:19.562345 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 19:55:19.571988 master-0 kubenswrapper[4813]: I1203 19:55:19.571890 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Dec 03 19:55:19.576410 master-0 kubenswrapper[4813]: I1203 19:55:19.576351 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"]
Dec 03 19:55:19.576879 master-0 kubenswrapper[4813]: I1203 19:55:19.576831 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 03 19:55:19.577166 master-0 kubenswrapper[4813]: W1203 19:55:19.577099 4813 reflector.go:561] object-"openshift-service-ca-operator"/"service-ca-operator-config": failed to list *v1.ConfigMap: configmaps "service-ca-operator-config" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-service-ca-operator": no relationship found between node 'master-0' and this object
Dec 03 19:55:19.577315 master-0 kubenswrapper[4813]: E1203 19:55:19.577190 4813 reflector.go:158] "Unhandled Error" err="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"service-ca-operator-config\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-service-ca-operator\": no relationship found between node 'master-0' and this object" logger="UnhandledError"
Dec 03 19:55:19.577433 master-0 kubenswrapper[4813]: I1203 19:55:19.577372 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 03 19:55:19.577706 master-0 kubenswrapper[4813]: I1203 19:55:19.577665 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 03 19:55:19.577895 master-0 kubenswrapper[4813]: I1203 19:55:19.577868 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crfnp\" (UniqueName: \"kubernetes.io/projected/0d4e4f88-7106-4a46-8b63-053345922fb0-kube-api-access-crfnp\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q"
Dec 03 19:55:19.578010 master-0 kubenswrapper[4813]: I1203 19:55:19.577994 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q"
Dec 03 19:55:19.578712 master-0 kubenswrapper[4813]: I1203 19:55:19.578674 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:19.579334 master-0 kubenswrapper[4813]: I1203 19:55:19.579267 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl"]
Dec 03 19:55:19.579570 master-0 kubenswrapper[4813]: I1203 19:55:19.579453 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 03 19:55:19.580432 master-0 kubenswrapper[4813]: I1203 19:55:19.580335 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl"
Dec 03 19:55:19.582091 master-0 kubenswrapper[4813]: I1203 19:55:19.581120 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn"]
Dec 03 19:55:19.582091 master-0 kubenswrapper[4813]: I1203 19:55:19.581705 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn"
Dec 03 19:55:19.588885 master-0 kubenswrapper[4813]: I1203 19:55:19.587998 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh"]
Dec 03 19:55:19.588885 master-0 kubenswrapper[4813]: I1203 19:55:19.588506 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Dec 03 19:55:19.589188 master-0 kubenswrapper[4813]: I1203 19:55:19.588939 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx"]
Dec 03 19:55:19.589188 master-0 kubenswrapper[4813]: I1203 19:55:19.588979 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Dec 03 19:55:19.589314 master-0 kubenswrapper[4813]: I1203 19:55:19.589229 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Dec 03 19:55:19.589460 master-0 kubenswrapper[4813]: I1203 19:55:19.589408 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Dec 03 19:55:19.589626 master-0 kubenswrapper[4813]: I1203 19:55:19.589598 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh"
Dec 03 19:55:19.589703 master-0 kubenswrapper[4813]: I1203 19:55:19.589628 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Dec 03 19:55:19.589861 master-0 kubenswrapper[4813]: I1203 19:55:19.589813 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Dec 03 19:55:19.589934 master-0 kubenswrapper[4813]: I1203 19:55:19.589596 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx"
Dec 03 19:55:19.590484 master-0 kubenswrapper[4813]: I1203 19:55:19.589958 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Dec 03 19:55:19.590484 master-0 kubenswrapper[4813]: I1203 19:55:19.590081 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Dec 03 19:55:19.590484 master-0 kubenswrapper[4813]: I1203 19:55:19.590326 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Dec 03 19:55:19.590484 master-0 kubenswrapper[4813]: I1203 19:55:19.590395 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Dec 03 19:55:19.590660 master-0 kubenswrapper[4813]: I1203 19:55:19.589419 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6"]
Dec 03 19:55:19.591854 master-0 kubenswrapper[4813]: I1203 19:55:19.591077 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6"
Dec 03 19:55:19.598359 master-0 kubenswrapper[4813]: I1203 19:55:19.597857 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 03 19:55:19.598359 master-0 kubenswrapper[4813]: I1203 19:55:19.598301 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 03 19:55:19.598506 master-0 kubenswrapper[4813]: I1203 19:55:19.598473 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 03 19:55:19.599106 master-0 kubenswrapper[4813]: I1203 19:55:19.598616 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 03 19:55:19.599106 master-0 kubenswrapper[4813]: I1203 19:55:19.598842 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 03 19:55:19.599519 master-0 kubenswrapper[4813]: I1203 19:55:19.599338 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2"]
Dec 03 19:55:19.600111 master-0 kubenswrapper[4813]: I1203 19:55:19.599764 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 03 19:55:19.600111 master-0 kubenswrapper[4813]: I1203 19:55:19.598898 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 03 19:55:19.600111 master-0 kubenswrapper[4813]: I1203 19:55:19.600043 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2"
Dec 03 19:55:19.603181 master-0 kubenswrapper[4813]: I1203 19:55:19.601413 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 03 19:55:19.603181 master-0 kubenswrapper[4813]: I1203 19:55:19.601437 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 03 19:55:19.603181 master-0 kubenswrapper[4813]: I1203 19:55:19.601610 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 03 19:55:19.603181 master-0 kubenswrapper[4813]: I1203 19:55:19.601742 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"]
Dec 03 19:55:19.603181 master-0 kubenswrapper[4813]: I1203 19:55:19.602288 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 19:55:19.604562 master-0 kubenswrapper[4813]: I1203 19:55:19.603572 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p"]
Dec 03 19:55:19.604562 master-0 kubenswrapper[4813]: I1203 19:55:19.603974 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p"
Dec 03 19:55:19.609536 master-0 kubenswrapper[4813]: I1203 19:55:19.609283 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Dec 03 19:55:19.609536 master-0 kubenswrapper[4813]: I1203 19:55:19.609309 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 03 19:55:19.609697 master-0 kubenswrapper[4813]: I1203 19:55:19.609683 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 03 19:55:19.610247 master-0 kubenswrapper[4813]: I1203 19:55:19.609894 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf"]
Dec 03 19:55:19.614900 master-0 kubenswrapper[4813]: I1203 19:55:19.610515 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf"
Dec 03 19:55:19.625348 master-0 kubenswrapper[4813]: I1203 19:55:19.625270 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt"]
Dec 03 19:55:19.626251 master-0 kubenswrapper[4813]: I1203 19:55:19.626210 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Dec 03 19:55:19.626320 master-0 kubenswrapper[4813]: I1203 19:55:19.626281 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt"
Dec 03 19:55:19.629177 master-0 kubenswrapper[4813]: I1203 19:55:19.629131 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Dec 03 19:55:19.630325 master-0 kubenswrapper[4813]: I1203 19:55:19.630310 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 03 19:55:19.630643 master-0 kubenswrapper[4813]: I1203 19:55:19.630601 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 03 19:55:19.630857 master-0 kubenswrapper[4813]: I1203 19:55:19.630838 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 03 19:55:19.630967 master-0 kubenswrapper[4813]: I1203 19:55:19.630948 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 03 19:55:19.631069 master-0 kubenswrapper[4813]: I1203 19:55:19.631038 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 03 19:55:19.631240 master-0 kubenswrapper[4813]: I1203 19:55:19.631190 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Dec 03 19:55:19.631358 master-0 kubenswrapper[4813]: I1203 19:55:19.631336 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 03 19:55:19.631524 master-0 kubenswrapper[4813]: I1203 19:55:19.631503 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 03 19:55:19.631652 master-0 kubenswrapper[4813]: I1203 19:55:19.631630 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj"]
Dec 03 19:55:19.632282 master-0 kubenswrapper[4813]: I1203 19:55:19.632255 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw"]
Dec 03 19:55:19.632569 master-0 kubenswrapper[4813]: I1203 19:55:19.632543 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj"
Dec 03 19:55:19.632922 master-0 kubenswrapper[4813]: I1203 19:55:19.632888 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv"]
Dec 03 19:55:19.639454 master-0 kubenswrapper[4813]: I1203 19:55:19.638186 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw"
Dec 03 19:55:19.639454 master-0 kubenswrapper[4813]: I1203 19:55:19.638994 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 03 19:55:19.639454 master-0 kubenswrapper[4813]: I1203 19:55:19.639132 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 03 19:55:19.639454 master-0 kubenswrapper[4813]: I1203 19:55:19.639239 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 03 19:55:19.639454 master-0 kubenswrapper[4813]: I1203 19:55:19.639422 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 03 19:55:19.645452 master-0 kubenswrapper[4813]: I1203 19:55:19.644407 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj"]
Dec 03 19:55:19.647130 master-0 kubenswrapper[4813]: I1203 19:55:19.645769 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 03 19:55:19.647130 master-0 kubenswrapper[4813]: I1203 19:55:19.646016 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 03 19:55:19.647130 master-0 kubenswrapper[4813]: I1203 19:55:19.646239 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv"
Dec 03 19:55:19.647130 master-0 kubenswrapper[4813]: I1203 19:55:19.646362 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 03 19:55:19.647130 master-0 kubenswrapper[4813]: I1203 19:55:19.646397 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Dec 03 19:55:19.647130 master-0 kubenswrapper[4813]: I1203 19:55:19.646524 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 03 19:55:19.647130 master-0 kubenswrapper[4813]: I1203 19:55:19.646586 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 03 19:55:19.647130 master-0 kubenswrapper[4813]: I1203 19:55:19.646616 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 03 19:55:19.647130 master-0 kubenswrapper[4813]: I1203 19:55:19.646670 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 03 19:55:19.647130 master-0 kubenswrapper[4813]: I1203 19:55:19.646928 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j"]
Dec 03 19:55:19.647130 master-0 kubenswrapper[4813]: I1203 19:55:19.647068 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj"
Dec 03 19:55:19.647540 master-0 kubenswrapper[4813]: I1203 19:55:19.647327 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 03 19:55:19.647841 master-0 kubenswrapper[4813]: I1203 19:55:19.647806 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n"]
Dec 03 19:55:19.651405 master-0 kubenswrapper[4813]: I1203 19:55:19.647927 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j"
Dec 03 19:55:19.651405 master-0 kubenswrapper[4813]: I1203 19:55:19.648096 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n"
Dec 03 19:55:19.651405 master-0 kubenswrapper[4813]: I1203 19:55:19.648536 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 03 19:55:19.651405 master-0 kubenswrapper[4813]: I1203 19:55:19.648908 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 03 19:55:19.651405 master-0 kubenswrapper[4813]: I1203 19:55:19.651106 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q"]
Dec 03 19:55:19.651633 master-0 kubenswrapper[4813]: I1203 19:55:19.651595 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Dec 03 19:55:19.651950 master-0 kubenswrapper[4813]: I1203 19:55:19.651763 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 03 19:55:19.652016 master-0 kubenswrapper[4813]: I1203 19:55:19.651957 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 03 19:55:19.652218 master-0 kubenswrapper[4813]: I1203 19:55:19.652186 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Dec 03 19:55:19.652543 master-0 kubenswrapper[4813]: I1203 19:55:19.652354 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 03 19:55:19.652739 master-0 kubenswrapper[4813]: I1203 19:55:19.652712 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 03 19:55:19.652930 master-0 kubenswrapper[4813]: I1203 19:55:19.652913 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 03 19:55:19.655393 master-0 kubenswrapper[4813]: I1203 19:55:19.653041 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 03 19:55:19.655393 master-0 kubenswrapper[4813]: I1203 19:55:19.653197 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 03 19:55:19.655393 master-0 kubenswrapper[4813]: I1203 19:55:19.653490 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 03 19:55:19.655393 master-0 kubenswrapper[4813]: I1203 19:55:19.653909 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Dec 03 19:55:19.655393 master-0 kubenswrapper[4813]: I1203 19:55:19.654369 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5"]
Dec 03 19:55:19.661090 master-0 kubenswrapper[4813]: I1203 19:55:19.661041 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"]
Dec 03 19:55:19.662311 master-0 kubenswrapper[4813]: I1203 19:55:19.662269 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 03 19:55:19.664758 master-0 kubenswrapper[4813]: I1203 19:55:19.663980 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5"]
Dec 03 19:55:19.668168 master-0 kubenswrapper[4813]: I1203 19:55:19.666322 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"]
Dec 03 19:55:19.668168 master-0 kubenswrapper[4813]: I1203 19:55:19.666904 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn"]
Dec 03 19:55:19.668168 master-0 kubenswrapper[4813]: I1203 19:55:19.668045 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6"]
Dec 03 19:55:19.669671 master-0 kubenswrapper[4813]: I1203 19:55:19.669636 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"]
Dec 03 19:55:19.671324 master-0 kubenswrapper[4813]: I1203 19:55:19.671291 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx"]
Dec 03 19:55:19.673452 master-0 kubenswrapper[4813]: I1203 19:55:19.673421 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p"]
Dec 03 19:55:19.674369 master-0 kubenswrapper[4813]: I1203 19:55:19.674339 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv"]
Dec 03 19:55:19.675771 master-0 kubenswrapper[4813]: I1203 19:55:19.675742 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-72rrb"]
Dec 03 19:55:19.676358 master-0 kubenswrapper[4813]: I1203 19:55:19.676328 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-72rrb"
Dec 03 19:55:19.677642 master-0 kubenswrapper[4813]: I1203 19:55:19.677609 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 03 19:55:19.678522 master-0 kubenswrapper[4813]: I1203 19:55:19.678482 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxqph\" (UniqueName: \"kubernetes.io/projected/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-kube-api-access-sxqph\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 19:55:19.678574 master-0 kubenswrapper[4813]: I1203 19:55:19.678526 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-serving-cert\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt"
Dec 03 19:55:19.678574 master-0 kubenswrapper[4813]: I1203 19:55:19.678556 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-service-ca\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf"
Dec 03 19:55:19.678638 master-0 kubenswrapper[4813]: I1203 19:55:19.678586 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ghnf\" (UniqueName: \"kubernetes.io/projected/a19b8f9e-6299-43bf-9aa5-22071b855773-kube-api-access-6ghnf\") pod
\"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" Dec 03 19:55:19.678806 master-0 kubenswrapper[4813]: I1203 19:55:19.678765 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sqtm\" (UniqueName: \"kubernetes.io/projected/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-kube-api-access-6sqtm\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" Dec 03 19:55:19.678850 master-0 kubenswrapper[4813]: I1203 19:55:19.678825 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:19.678883 master-0 kubenswrapper[4813]: I1203 19:55:19.678851 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" Dec 03 19:55:19.678883 master-0 kubenswrapper[4813]: I1203 19:55:19.678875 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-855t4\" (UniqueName: \"kubernetes.io/projected/ba68608f-6b36-455e-b80b-d19237df9312-kube-api-access-855t4\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " 
pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" Dec 03 19:55:19.678933 master-0 kubenswrapper[4813]: I1203 19:55:19.678902 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-457ln\" (UniqueName: \"kubernetes.io/projected/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-kube-api-access-457ln\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" Dec 03 19:55:19.679023 master-0 kubenswrapper[4813]: I1203 19:55:19.678983 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/daa8efc0-4514-4a14-80f5-ab9eca53a127-serving-cert\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" Dec 03 19:55:19.679121 master-0 kubenswrapper[4813]: I1203 19:55:19.679098 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtwbs\" (UniqueName: \"kubernetes.io/projected/b84835e3-e8bc-4aa4-a8f3-f9be702a358a-kube-api-access-vtwbs\") pod \"csi-snapshot-controller-operator-7b795784b8-4gppw\" (UID: \"b84835e3-e8bc-4aa4-a8f3-f9be702a358a\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw" Dec 03 19:55:19.679162 master-0 kubenswrapper[4813]: I1203 19:55:19.679132 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" Dec 03 19:55:19.679223 master-0 kubenswrapper[4813]: I1203 19:55:19.679159 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11e2c94f-f9e9-415b-a550-3006a4632ba4-serving-cert\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" Dec 03 19:55:19.679223 master-0 kubenswrapper[4813]: I1203 19:55:19.679206 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2qch\" (UniqueName: \"kubernetes.io/projected/b673cb04-f6f0-4113-bdcd-d6685b942c9f-kube-api-access-m2qch\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:19.679292 master-0 kubenswrapper[4813]: I1203 19:55:19.679231 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-kube-api-access\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" Dec 03 19:55:19.679404 master-0 kubenswrapper[4813]: I1203 19:55:19.679379 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-trusted-ca\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 
03 19:55:19.679442 master-0 kubenswrapper[4813]: I1203 19:55:19.679416 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9f99422-7991-40ef-92a1-de2e603e47b9-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" Dec 03 19:55:19.679480 master-0 kubenswrapper[4813]: I1203 19:55:19.679463 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-available-featuregates\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 19:55:19.679514 master-0 kubenswrapper[4813]: I1203 19:55:19.679498 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" Dec 03 19:55:19.679543 master-0 kubenswrapper[4813]: I1203 19:55:19.679522 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qrgh\" (UniqueName: \"kubernetes.io/projected/128ed384-7ab6-41b6-bf45-c8fda917d52f-kube-api-access-7qrgh\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" Dec 03 19:55:19.679574 master-0 kubenswrapper[4813]: I1203 19:55:19.679548 4813 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 19:55:19.679610 master-0 kubenswrapper[4813]: I1203 19:55:19.679576 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-config\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" Dec 03 19:55:19.679610 master-0 kubenswrapper[4813]: I1203 19:55:19.679601 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ed25861-1328-45e7-922e-37588a0b019c-trusted-ca\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 19:55:19.679662 master-0 kubenswrapper[4813]: I1203 19:55:19.679632 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr8x9\" (UniqueName: \"kubernetes.io/projected/5decce88-c71e-411c-87b5-a37dd0f77e7b-kube-api-access-mr8x9\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 19:55:19.679694 master-0 kubenswrapper[4813]: I1203 19:55:19.679661 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-qdhcd\" (UniqueName: \"kubernetes.io/projected/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-kube-api-access-qdhcd\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 19:55:19.679735 master-0 kubenswrapper[4813]: I1203 19:55:19.679716 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk4z4\" (UniqueName: \"kubernetes.io/projected/f9f99422-7991-40ef-92a1-de2e603e47b9-kube-api-access-pk4z4\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" Dec 03 19:55:19.679801 master-0 kubenswrapper[4813]: I1203 19:55:19.679764 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-config\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 19:55:19.679847 master-0 kubenswrapper[4813]: I1203 19:55:19.679814 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sdw4\" (UniqueName: \"kubernetes.io/projected/d5f33153-bff1-403f-ae17-b7e90500365d-kube-api-access-5sdw4\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 19:55:19.679911 master-0 kubenswrapper[4813]: I1203 19:55:19.679886 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-service-ca-bundle\") 
pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 19:55:19.680038 master-0 kubenswrapper[4813]: I1203 19:55:19.680004 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5decce88-c71e-411c-87b5-a37dd0f77e7b-trusted-ca\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 19:55:19.680109 master-0 kubenswrapper[4813]: I1203 19:55:19.680085 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5decce88-c71e-411c-87b5-a37dd0f77e7b-bound-sa-token\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 19:55:19.680109 master-0 kubenswrapper[4813]: I1203 19:55:19.680097 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh"] Dec 03 19:55:19.680859 master-0 kubenswrapper[4813]: I1203 19:55:19.680119 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-config\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" Dec 03 19:55:19.680921 master-0 kubenswrapper[4813]: I1203 19:55:19.680889 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-profile-collector-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 19:55:19.680921 master-0 kubenswrapper[4813]: I1203 19:55:19.680912 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" Dec 03 19:55:19.681052 master-0 kubenswrapper[4813]: I1203 19:55:19.680997 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/943feb0d-7d31-446a-9100-dfc4ef013d12-serving-cert\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" Dec 03 19:55:19.681098 master-0 kubenswrapper[4813]: I1203 19:55:19.681059 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 19:55:19.681098 master-0 kubenswrapper[4813]: I1203 19:55:19.681083 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-client\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " 
pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 19:55:19.681177 master-0 kubenswrapper[4813]: I1203 19:55:19.681111 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e2c94f-f9e9-415b-a550-3006a4632ba4-config\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" Dec 03 19:55:19.681177 master-0 kubenswrapper[4813]: I1203 19:55:19.681159 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" Dec 03 19:55:19.681258 master-0 kubenswrapper[4813]: I1203 19:55:19.681184 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/943feb0d-7d31-446a-9100-dfc4ef013d12-kube-api-access\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" Dec 03 19:55:19.681258 master-0 kubenswrapper[4813]: I1203 19:55:19.681227 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 19:55:19.681340 master-0 
kubenswrapper[4813]: I1203 19:55:19.681250 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-serving-cert\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" Dec 03 19:55:19.681340 master-0 kubenswrapper[4813]: I1203 19:55:19.681324 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-serving-cert\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" Dec 03 19:55:19.681413 master-0 kubenswrapper[4813]: I1203 19:55:19.681349 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-trusted-ca\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:19.681413 master-0 kubenswrapper[4813]: I1203 19:55:19.681374 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa8efc0-4514-4a14-80f5-ab9eca53a127-config\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" Dec 03 19:55:19.681413 master-0 kubenswrapper[4813]: I1203 19:55:19.681402 4813 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-serving-cert\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" Dec 03 19:55:19.681534 master-0 kubenswrapper[4813]: I1203 19:55:19.681422 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-bound-sa-token\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 19:55:19.681534 master-0 kubenswrapper[4813]: I1203 19:55:19.681444 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfqnq\" (UniqueName: \"kubernetes.io/projected/11e2c94f-f9e9-415b-a550-3006a4632ba4-kube-api-access-pfqnq\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" Dec 03 19:55:19.681534 master-0 kubenswrapper[4813]: I1203 19:55:19.681490 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a864f2-934f-4197-9753-24c9bc7f1fca-serving-cert\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 19:55:19.681800 master-0 kubenswrapper[4813]: I1203 19:55:19.681706 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59d2r\" (UniqueName: 
\"kubernetes.io/projected/78a864f2-934f-4197-9753-24c9bc7f1fca-kube-api-access-59d2r\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 19:55:19.681857 master-0 kubenswrapper[4813]: I1203 19:55:19.681832 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ba68608f-6b36-455e-b80b-d19237df9312-telemetry-config\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" Dec 03 19:55:19.681888 master-0 kubenswrapper[4813]: I1203 19:55:19.681859 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f9f99422-7991-40ef-92a1-de2e603e47b9-operand-assets\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" Dec 03 19:55:19.681916 master-0 kubenswrapper[4813]: I1203 19:55:19.681888 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 19:55:19.681916 master-0 kubenswrapper[4813]: I1203 19:55:19.681909 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-serving-cert\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: 
\"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 19:55:19.681970 master-0 kubenswrapper[4813]: I1203 19:55:19.681927 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crfnp\" (UniqueName: \"kubernetes.io/projected/0d4e4f88-7106-4a46-8b63-053345922fb0-kube-api-access-crfnp\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 19:55:19.681970 master-0 kubenswrapper[4813]: I1203 19:55:19.681946 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-config\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" Dec 03 19:55:19.682022 master-0 kubenswrapper[4813]: I1203 19:55:19.681963 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bhk4\" (UniqueName: \"kubernetes.io/projected/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-kube-api-access-6bhk4\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 19:55:19.682064 master-0 kubenswrapper[4813]: I1203 19:55:19.682044 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-trusted-ca-bundle\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 19:55:19.682095 master-0 
kubenswrapper[4813]: I1203 19:55:19.682067 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv24n\" (UniqueName: \"kubernetes.io/projected/7ed25861-1328-45e7-922e-37588a0b019c-kube-api-access-cv24n\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:19.682095 master-0 kubenswrapper[4813]: I1203 19:55:19.682083 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-serving-cert\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv"
Dec 03 19:55:19.682156 master-0 kubenswrapper[4813]: I1203 19:55:19.682102 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-config\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf"
Dec 03 19:55:19.682156 master-0 kubenswrapper[4813]: I1203 19:55:19.682119 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbsx8\" (UniqueName: \"kubernetes.io/projected/daa8efc0-4514-4a14-80f5-ab9eca53a127-kube-api-access-rbsx8\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh"
Dec 03 19:55:19.682156 master-0 kubenswrapper[4813]: I1203 19:55:19.682140 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:19.682156 master-0 kubenswrapper[4813]: I1203 19:55:19.682154 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-ca\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf"
Dec 03 19:55:19.682478 master-0 kubenswrapper[4813]: I1203 19:55:19.682184 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74dvx\" (UniqueName: \"kubernetes.io/projected/b4316c8d-a1d3-4e51-83cc-d0eecb809924-kube-api-access-74dvx\") pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j"
Dec 03 19:55:19.682478 master-0 kubenswrapper[4813]: I1203 19:55:19.682206 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/943feb0d-7d31-446a-9100-dfc4ef013d12-config\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn"
Dec 03 19:55:19.682478 master-0 kubenswrapper[4813]: I1203 19:55:19.682224 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-profile-collector-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 19:55:19.682478 master-0 kubenswrapper[4813]: I1203 19:55:19.682241 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:19.682478 master-0 kubenswrapper[4813]: I1203 19:55:19.682256 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-config\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx"
Dec 03 19:55:19.682478 master-0 kubenswrapper[4813]: E1203 19:55:19.682422 4813 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Dec 03 19:55:19.682478 master-0 kubenswrapper[4813]: E1203 19:55:19.682466 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert podName:0d4e4f88-7106-4a46-8b63-053345922fb0 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:20.182451909 +0000 UTC m=+164.611250348 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-pqz7q" (UID: "0d4e4f88-7106-4a46-8b63-053345922fb0") : secret "package-server-manager-serving-cert" not found
Dec 03 19:55:19.683020 master-0 kubenswrapper[4813]: I1203 19:55:19.682979 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw"]
Dec 03 19:55:19.684687 master-0 kubenswrapper[4813]: I1203 19:55:19.684611 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt"]
Dec 03 19:55:19.690311 master-0 kubenswrapper[4813]: I1203 19:55:19.690197 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj"]
Dec 03 19:55:19.694121 master-0 kubenswrapper[4813]: I1203 19:55:19.694086 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"]
Dec 03 19:55:19.698250 master-0 kubenswrapper[4813]: I1203 19:55:19.698227 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf"]
Dec 03 19:55:19.700521 master-0 kubenswrapper[4813]: I1203 19:55:19.700283 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2"]
Dec 03 19:55:19.701035 master-0 kubenswrapper[4813]: I1203 19:55:19.700859 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j"]
Dec 03 19:55:19.702932 master-0 kubenswrapper[4813]: I1203 19:55:19.702880 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl"]
Dec 03 19:55:19.705149 master-0 kubenswrapper[4813]: I1203 19:55:19.705111 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n"]
Dec 03 19:55:19.707257 master-0 kubenswrapper[4813]: I1203 19:55:19.706712 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj"]
Dec 03 19:55:19.707672 master-0 kubenswrapper[4813]: I1203 19:55:19.707577 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crfnp\" (UniqueName: \"kubernetes.io/projected/0d4e4f88-7106-4a46-8b63-053345922fb0-kube-api-access-crfnp\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q"
Dec 03 19:55:19.783506 master-0 kubenswrapper[4813]: I1203 19:55:19.783075 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ba68608f-6b36-455e-b80b-d19237df9312-telemetry-config\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"
Dec 03 19:55:19.783506 master-0 kubenswrapper[4813]: I1203 19:55:19.783119 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f9f99422-7991-40ef-92a1-de2e603e47b9-operand-assets\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl"
Dec 03 19:55:19.783506 master-0 kubenswrapper[4813]: I1203 19:55:19.783139 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a864f2-934f-4197-9753-24c9bc7f1fca-serving-cert\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf"
Dec 03 19:55:19.783506 master-0 kubenswrapper[4813]: I1203 19:55:19.783155 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59d2r\" (UniqueName: \"kubernetes.io/projected/78a864f2-934f-4197-9753-24c9bc7f1fca-kube-api-access-59d2r\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf"
Dec 03 19:55:19.783506 master-0 kubenswrapper[4813]: I1203 19:55:19.783186 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-serving-cert\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 19:55:19.783506 master-0 kubenswrapper[4813]: I1203 19:55:19.783210 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-config\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5"
Dec 03 19:55:19.783506 master-0 kubenswrapper[4813]: I1203 19:55:19.783233 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bhk4\" (UniqueName: \"kubernetes.io/projected/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-kube-api-access-6bhk4\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj"
Dec 03 19:55:19.783506 master-0 kubenswrapper[4813]: I1203 19:55:19.783257 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-trusted-ca-bundle\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 19:55:19.783506 master-0 kubenswrapper[4813]: I1203 19:55:19.783281 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv24n\" (UniqueName: \"kubernetes.io/projected/7ed25861-1328-45e7-922e-37588a0b019c-kube-api-access-cv24n\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:19.783506 master-0 kubenswrapper[4813]: I1203 19:55:19.783303 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-serving-cert\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv"
Dec 03 19:55:19.783506 master-0 kubenswrapper[4813]: I1203 19:55:19.783332 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/371917da-b783-4acc-81af-1cfc903269f4-host-slash\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb"
Dec 03 19:55:19.783506 master-0 kubenswrapper[4813]: I1203 19:55:19.783358 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:19.783506 master-0 kubenswrapper[4813]: I1203 19:55:19.783383 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-config\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf"
Dec 03 19:55:19.783506 master-0 kubenswrapper[4813]: I1203 19:55:19.783411 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbsx8\" (UniqueName: \"kubernetes.io/projected/daa8efc0-4514-4a14-80f5-ab9eca53a127-kube-api-access-rbsx8\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh"
Dec 03 19:55:19.783506 master-0 kubenswrapper[4813]: I1203 19:55:19.783440 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-ca\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf"
Dec 03 19:55:19.784384 master-0 kubenswrapper[4813]: I1203 19:55:19.783468 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74dvx\" (UniqueName: \"kubernetes.io/projected/b4316c8d-a1d3-4e51-83cc-d0eecb809924-kube-api-access-74dvx\") pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j"
Dec 03 19:55:19.784384 master-0 kubenswrapper[4813]: I1203 19:55:19.783493 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/943feb0d-7d31-446a-9100-dfc4ef013d12-config\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn"
Dec 03 19:55:19.784384 master-0 kubenswrapper[4813]: I1203 19:55:19.783518 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-profile-collector-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 19:55:19.784384 master-0 kubenswrapper[4813]: I1203 19:55:19.783548 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:19.784384 master-0 kubenswrapper[4813]: I1203 19:55:19.783572 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-config\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx"
Dec 03 19:55:19.784384 master-0 kubenswrapper[4813]: I1203 19:55:19.783599 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-service-ca\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf"
Dec 03 19:55:19.784384 master-0 kubenswrapper[4813]: I1203 19:55:19.783626 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxqph\" (UniqueName: \"kubernetes.io/projected/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-kube-api-access-sxqph\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 19:55:19.784384 master-0 kubenswrapper[4813]: I1203 19:55:19.783649 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-serving-cert\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt"
Dec 03 19:55:19.784384 master-0 kubenswrapper[4813]: I1203 19:55:19.783653 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f9f99422-7991-40ef-92a1-de2e603e47b9-operand-assets\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl"
Dec 03 19:55:19.784384 master-0 kubenswrapper[4813]: I1203 19:55:19.783672 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ghnf\" (UniqueName: \"kubernetes.io/projected/a19b8f9e-6299-43bf-9aa5-22071b855773-kube-api-access-6ghnf\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 19:55:19.784384 master-0 kubenswrapper[4813]: I1203 19:55:19.783701 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sqtm\" (UniqueName: \"kubernetes.io/projected/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-kube-api-access-6sqtm\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5"
Dec 03 19:55:19.784384 master-0 kubenswrapper[4813]: I1203 19:55:19.783726 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6"
Dec 03 19:55:19.784384 master-0 kubenswrapper[4813]: I1203 19:55:19.783750 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j"
Dec 03 19:55:19.784384 master-0 kubenswrapper[4813]: I1203 19:55:19.783798 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-855t4\" (UniqueName: \"kubernetes.io/projected/ba68608f-6b36-455e-b80b-d19237df9312-kube-api-access-855t4\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"
Dec 03 19:55:19.784384 master-0 kubenswrapper[4813]: I1203 19:55:19.783822 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-457ln\" (UniqueName: \"kubernetes.io/projected/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-kube-api-access-457ln\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt"
Dec 03 19:55:19.784965 master-0 kubenswrapper[4813]: I1203 19:55:19.783839 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/daa8efc0-4514-4a14-80f5-ab9eca53a127-serving-cert\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh"
Dec 03 19:55:19.784965 master-0 kubenswrapper[4813]: I1203 19:55:19.783860 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtwbs\" (UniqueName: \"kubernetes.io/projected/b84835e3-e8bc-4aa4-a8f3-f9be702a358a-kube-api-access-vtwbs\") pod \"csi-snapshot-controller-operator-7b795784b8-4gppw\" (UID: \"b84835e3-e8bc-4aa4-a8f3-f9be702a358a\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw"
Dec 03 19:55:19.784965 master-0 kubenswrapper[4813]: I1203 19:55:19.783891 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx"
Dec 03 19:55:19.784965 master-0 kubenswrapper[4813]: I1203 19:55:19.783915 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11e2c94f-f9e9-415b-a550-3006a4632ba4-serving-cert\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5"
Dec 03 19:55:19.784965 master-0 kubenswrapper[4813]: I1203 19:55:19.783933 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-kube-api-access\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj"
Dec 03 19:55:19.784965 master-0 kubenswrapper[4813]: I1203 19:55:19.783958 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2qch\" (UniqueName: \"kubernetes.io/projected/b673cb04-f6f0-4113-bdcd-d6685b942c9f-kube-api-access-m2qch\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6"
Dec 03 19:55:19.784965 master-0 kubenswrapper[4813]: I1203 19:55:19.783983 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9f99422-7991-40ef-92a1-de2e603e47b9-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl"
Dec 03 19:55:19.784965 master-0 kubenswrapper[4813]: I1203 19:55:19.784007 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-trusted-ca\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj"
Dec 03 19:55:19.784965 master-0 kubenswrapper[4813]: I1203 19:55:19.784024 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-available-featuregates\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv"
Dec 03 19:55:19.784965 master-0 kubenswrapper[4813]: I1203 19:55:19.784063 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p"
Dec 03 19:55:19.784965 master-0 kubenswrapper[4813]: E1203 19:55:19.784079 4813 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Dec 03 19:55:19.784965 master-0 kubenswrapper[4813]: E1203 19:55:19.784156 4813 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Dec 03 19:55:19.784965 master-0 kubenswrapper[4813]: E1203 19:55:19.784245 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert podName:7ed25861-1328-45e7-922e-37588a0b019c nodeName:}" failed. No retries permitted until 2025-12-03 19:55:20.284168353 +0000 UTC m=+164.712966802 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert") pod "cluster-node-tuning-operator-bbd9b9dff-vqzdb" (UID: "7ed25861-1328-45e7-922e-37588a0b019c") : secret "performance-addon-operator-webhook-cert" not found
Dec 03 19:55:19.784965 master-0 kubenswrapper[4813]: E1203 19:55:19.784266 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls podName:128ed384-7ab6-41b6-bf45-c8fda917d52f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:20.284257205 +0000 UTC m=+164.713055654 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls") pod "dns-operator-6b7bcd6566-4wcq2" (UID: "128ed384-7ab6-41b6-bf45-c8fda917d52f") : secret "metrics-tls" not found
Dec 03 19:55:19.785489 master-0 kubenswrapper[4813]: I1203 19:55:19.785173 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-trusted-ca-bundle\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 19:55:19.785489 master-0 kubenswrapper[4813]: I1203 19:55:19.785188 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-service-ca\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf"
Dec 03 19:55:19.785489 master-0 kubenswrapper[4813]: I1203 19:55:19.785229 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-ca\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf"
Dec 03 19:55:19.785489 master-0 kubenswrapper[4813]: E1203 19:55:19.785267 4813 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Dec 03 19:55:19.785489 master-0 kubenswrapper[4813]: E1203 19:55:19.785315 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls podName:7ed25861-1328-45e7-922e-37588a0b019c nodeName:}" failed. No retries permitted until 2025-12-03 19:55:20.285299281 +0000 UTC m=+164.714097850 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bbd9b9dff-vqzdb" (UID: "7ed25861-1328-45e7-922e-37588a0b019c") : secret "node-tuning-operator-tls" not found
Dec 03 19:55:19.785489 master-0 kubenswrapper[4813]: I1203 19:55:19.784093 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2"
Dec 03 19:55:19.785489 master-0 kubenswrapper[4813]: I1203 19:55:19.785362 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qrgh\" (UniqueName: \"kubernetes.io/projected/128ed384-7ab6-41b6-bf45-c8fda917d52f-kube-api-access-7qrgh\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2"
Dec 03 19:55:19.785489 master-0 kubenswrapper[4813]: I1203 19:55:19.785391 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-config\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj"
Dec 03 19:55:19.785489 master-0 kubenswrapper[4813]: I1203 19:55:19.785440 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ed25861-1328-45e7-922e-37588a0b019c-trusted-ca\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:19.785489 master-0 kubenswrapper[4813]: I1203 19:55:19.785467 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr8x9\" (UniqueName: \"kubernetes.io/projected/5decce88-c71e-411c-87b5-a37dd0f77e7b-kube-api-access-mr8x9\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p"
Dec 03 19:55:19.785886 master-0 kubenswrapper[4813]: I1203 19:55:19.785499 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdhcd\" (UniqueName: \"kubernetes.io/projected/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-kube-api-access-qdhcd\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv"
Dec 03 19:55:19.785886 master-0 kubenswrapper[4813]: I1203 19:55:19.785530 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk4z4\" (UniqueName: \"kubernetes.io/projected/f9f99422-7991-40ef-92a1-de2e603e47b9-kube-api-access-pk4z4\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl"
Dec 03 19:55:19.785886 master-0 kubenswrapper[4813]: I1203 19:55:19.785556 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-config\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 19:55:19.785886 master-0 kubenswrapper[4813]: I1203 19:55:19.785588 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4v7k\" (UniqueName: \"kubernetes.io/projected/371917da-b783-4acc-81af-1cfc903269f4-kube-api-access-w4v7k\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb"
Dec 03 19:55:19.785886 master-0 kubenswrapper[4813]: I1203 19:55:19.785617 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-service-ca-bundle\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 19:55:19.785886 master-0 kubenswrapper[4813]: I1203 19:55:19.785642 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5decce88-c71e-411c-87b5-a37dd0f77e7b-trusted-ca\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p"
Dec 03 19:55:19.785886 master-0 kubenswrapper[4813]: I1203 19:55:19.785670 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5decce88-c71e-411c-87b5-a37dd0f77e7b-bound-sa-token\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p"
Dec 03 19:55:19.785886 master-0 kubenswrapper[4813]: I1203 19:55:19.785689 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-config\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf"
Dec 03 19:55:19.785886 master-0 kubenswrapper[4813]: I1203 19:55:19.785697 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sdw4\" (UniqueName: \"kubernetes.io/projected/d5f33153-bff1-403f-ae17-b7e90500365d-kube-api-access-5sdw4\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n"
Dec 03 19:55:19.785886 master-0 kubenswrapper[4813]: I1203 19:55:19.785733 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 19:55:19.785886 master-0 kubenswrapper[4813]: I1203 19:55:19.785762 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-config\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt"
Dec 03 19:55:19.785886 master-0 kubenswrapper[4813]: I1203 19:55:19.785815 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-profile-collector-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n"
Dec 03 19:55:19.785886 master-0 kubenswrapper[4813]: I1203 19:55:19.785852 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/943feb0d-7d31-446a-9100-dfc4ef013d12-serving-cert\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn"
Dec 03 19:55:19.785886 master-0 kubenswrapper[4813]: I1203 19:55:19.785873 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n"
Dec 03 19:55:19.785886 master-0 kubenswrapper[4813]: I1203 19:55:19.785896 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-client\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf"
Dec 03 19:55:19.786506 master-0 kubenswrapper[4813]: I1203 19:55:19.785920 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"
Dec 03 19:55:19.786506 master-0 kubenswrapper[4813]: I1203 19:55:19.785944 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/943feb0d-7d31-446a-9100-dfc4ef013d12-kube-api-access\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn"
Dec 03 19:55:19.786506 master-0 kubenswrapper[4813]: I1203 19:55:19.785984 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj"
Dec 03 19:55:19.786506 master-0 kubenswrapper[4813]: E1203 19:55:19.786429 4813 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Dec 03 19:55:19.786506 master-0 kubenswrapper[4813]: E1203 19:55:19.786464 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert podName:a19b8f9e-6299-43bf-9aa5-22071b855773 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:20.286452591 +0000 UTC m=+164.715251110 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert") pod "olm-operator-76bd5d69c7-wg7fw" (UID: "a19b8f9e-6299-43bf-9aa5-22071b855773") : secret "olm-operator-serving-cert" not found Dec 03 19:55:19.786506 master-0 kubenswrapper[4813]: I1203 19:55:19.786498 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e2c94f-f9e9-415b-a550-3006a4632ba4-config\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" Dec 03 19:55:19.786734 master-0 kubenswrapper[4813]: I1203 19:55:19.786531 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-serving-cert\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" Dec 03 19:55:19.786734 master-0 kubenswrapper[4813]: I1203 19:55:19.786556 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-serving-cert\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" Dec 03 19:55:19.786734 master-0 kubenswrapper[4813]: I1203 19:55:19.786591 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-trusted-ca\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: 
\"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:19.786734 master-0 kubenswrapper[4813]: I1203 19:55:19.786618 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa8efc0-4514-4a14-80f5-ab9eca53a127-config\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" Dec 03 19:55:19.786734 master-0 kubenswrapper[4813]: I1203 19:55:19.786642 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfqnq\" (UniqueName: \"kubernetes.io/projected/11e2c94f-f9e9-415b-a550-3006a4632ba4-kube-api-access-pfqnq\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" Dec 03 19:55:19.786734 master-0 kubenswrapper[4813]: I1203 19:55:19.786678 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-serving-cert\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" Dec 03 19:55:19.786734 master-0 kubenswrapper[4813]: I1203 19:55:19.786704 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-bound-sa-token\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 19:55:19.786734 
master-0 kubenswrapper[4813]: I1203 19:55:19.786732 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/371917da-b783-4acc-81af-1cfc903269f4-iptables-alerter-script\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb" Dec 03 19:55:19.787089 master-0 kubenswrapper[4813]: I1203 19:55:19.786951 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/943feb0d-7d31-446a-9100-dfc4ef013d12-config\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" Dec 03 19:55:19.787793 master-0 kubenswrapper[4813]: I1203 19:55:19.787373 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-service-ca-bundle\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 19:55:19.787793 master-0 kubenswrapper[4813]: E1203 19:55:19.787469 4813 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 03 19:55:19.787793 master-0 kubenswrapper[4813]: E1203 19:55:19.787734 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics podName:b673cb04-f6f0-4113-bdcd-d6685b942c9f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:20.287711513 +0000 UTC m=+164.716510002 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-xqvv6" (UID: "b673cb04-f6f0-4113-bdcd-d6685b942c9f") : secret "marketplace-operator-metrics" not found Dec 03 19:55:19.787944 master-0 kubenswrapper[4813]: E1203 19:55:19.787844 4813 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 03 19:55:19.787944 master-0 kubenswrapper[4813]: E1203 19:55:19.787905 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs podName:b4316c8d-a1d3-4e51-83cc-d0eecb809924 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:20.287882927 +0000 UTC m=+164.716681516 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs") pod "multus-admission-controller-78ddcf56f9-nqn2j" (UID: "b4316c8d-a1d3-4e51-83cc-d0eecb809924") : secret "multus-admission-controller-secret" not found Dec 03 19:55:19.788353 master-0 kubenswrapper[4813]: I1203 19:55:19.788311 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-serving-cert\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 19:55:19.788839 master-0 kubenswrapper[4813]: I1203 19:55:19.788747 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a864f2-934f-4197-9753-24c9bc7f1fca-serving-cert\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") 
" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 19:55:19.789345 master-0 kubenswrapper[4813]: I1203 19:55:19.789038 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-trusted-ca\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:19.789345 master-0 kubenswrapper[4813]: I1203 19:55:19.789194 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-config\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 19:55:19.789771 master-0 kubenswrapper[4813]: I1203 19:55:19.789735 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa8efc0-4514-4a14-80f5-ab9eca53a127-config\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" Dec 03 19:55:19.789965 master-0 kubenswrapper[4813]: E1203 19:55:19.789900 4813 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Dec 03 19:55:19.789965 master-0 kubenswrapper[4813]: E1203 19:55:19.789950 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert podName:d5f33153-bff1-403f-ae17-b7e90500365d nodeName:}" failed. No retries permitted until 2025-12-03 19:55:20.28993389 +0000 UTC m=+164.718732389 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert") pod "catalog-operator-7cf5cf757f-25z8n" (UID: "d5f33153-bff1-403f-ae17-b7e90500365d") : secret "catalog-operator-serving-cert" not found Dec 03 19:55:19.790629 master-0 kubenswrapper[4813]: I1203 19:55:19.790592 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-serving-cert\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" Dec 03 19:55:19.790681 master-0 kubenswrapper[4813]: I1203 19:55:19.790626 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e2c94f-f9e9-415b-a550-3006a4632ba4-config\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" Dec 03 19:55:19.792876 master-0 kubenswrapper[4813]: I1203 19:55:19.792560 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/943feb0d-7d31-446a-9100-dfc4ef013d12-serving-cert\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" Dec 03 19:55:19.792876 master-0 kubenswrapper[4813]: I1203 19:55:19.792558 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5decce88-c71e-411c-87b5-a37dd0f77e7b-trusted-ca\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " 
pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 19:55:19.792876 master-0 kubenswrapper[4813]: I1203 19:55:19.792647 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/daa8efc0-4514-4a14-80f5-ab9eca53a127-serving-cert\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" Dec 03 19:55:19.792876 master-0 kubenswrapper[4813]: I1203 19:55:19.792704 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-serving-cert\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" Dec 03 19:55:19.793359 master-0 kubenswrapper[4813]: I1203 19:55:19.793011 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-available-featuregates\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 19:55:19.793359 master-0 kubenswrapper[4813]: E1203 19:55:19.793082 4813 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 03 19:55:19.793359 master-0 kubenswrapper[4813]: E1203 19:55:19.793125 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls podName:5decce88-c71e-411c-87b5-a37dd0f77e7b nodeName:}" failed. 
No retries permitted until 2025-12-03 19:55:20.29311105 +0000 UTC m=+164.721909609 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-59j4p" (UID: "5decce88-c71e-411c-87b5-a37dd0f77e7b") : secret "image-registry-operator-tls" not found Dec 03 19:55:19.793359 master-0 kubenswrapper[4813]: I1203 19:55:19.793172 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-config\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" Dec 03 19:55:19.793359 master-0 kubenswrapper[4813]: E1203 19:55:19.793180 4813 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 03 19:55:19.793359 master-0 kubenswrapper[4813]: I1203 19:55:19.793249 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-client\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 19:55:19.793359 master-0 kubenswrapper[4813]: E1203 19:55:19.793258 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls podName:3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf nodeName:}" failed. No retries permitted until 2025-12-03 19:55:20.293245413 +0000 UTC m=+164.722044002 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls") pod "ingress-operator-85dbd94574-l7bzj" (UID: "3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf") : secret "metrics-tls" not found Dec 03 19:55:19.793359 master-0 kubenswrapper[4813]: I1203 19:55:19.793358 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-trusted-ca\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 19:55:19.793636 master-0 kubenswrapper[4813]: I1203 19:55:19.793596 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11e2c94f-f9e9-415b-a550-3006a4632ba4-serving-cert\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" Dec 03 19:55:19.793636 master-0 kubenswrapper[4813]: I1203 19:55:19.793611 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-serving-cert\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" Dec 03 19:55:19.793748 master-0 kubenswrapper[4813]: I1203 19:55:19.793722 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ed25861-1328-45e7-922e-37588a0b019c-trusted-ca\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 19:55:19.793801 master-0 kubenswrapper[4813]: I1203 19:55:19.793746 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-config\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" Dec 03 19:55:19.794350 master-0 kubenswrapper[4813]: I1203 19:55:19.794317 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-serving-cert\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 19:55:19.794980 master-0 kubenswrapper[4813]: I1203 19:55:19.794950 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9f99422-7991-40ef-92a1-de2e603e47b9-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" Dec 03 19:55:19.795322 master-0 kubenswrapper[4813]: I1203 19:55:19.795288 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-serving-cert\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" Dec 03 19:55:19.796251 master-0 kubenswrapper[4813]: I1203 19:55:19.796233 4813 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-config\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" Dec 03 19:55:19.800862 master-0 kubenswrapper[4813]: I1203 19:55:19.797328 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-profile-collector-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 19:55:19.804831 master-0 kubenswrapper[4813]: I1203 19:55:19.801749 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv24n\" (UniqueName: \"kubernetes.io/projected/7ed25861-1328-45e7-922e-37588a0b019c-kube-api-access-cv24n\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 19:55:19.804831 master-0 kubenswrapper[4813]: I1203 19:55:19.801827 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxqph\" (UniqueName: \"kubernetes.io/projected/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-kube-api-access-sxqph\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 19:55:19.804831 master-0 kubenswrapper[4813]: I1203 19:55:19.802003 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-profile-collector-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: 
\"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" Dec 03 19:55:19.804831 master-0 kubenswrapper[4813]: I1203 19:55:19.802520 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bhk4\" (UniqueName: \"kubernetes.io/projected/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-kube-api-access-6bhk4\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 19:55:19.806360 master-0 kubenswrapper[4813]: I1203 19:55:19.806327 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5decce88-c71e-411c-87b5-a37dd0f77e7b-bound-sa-token\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 19:55:19.811051 master-0 kubenswrapper[4813]: I1203 19:55:19.809376 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sdw4\" (UniqueName: \"kubernetes.io/projected/d5f33153-bff1-403f-ae17-b7e90500365d-kube-api-access-5sdw4\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 19:55:19.812052 master-0 kubenswrapper[4813]: I1203 19:55:19.812017 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59d2r\" (UniqueName: \"kubernetes.io/projected/78a864f2-934f-4197-9753-24c9bc7f1fca-kube-api-access-59d2r\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 19:55:19.820369 master-0 kubenswrapper[4813]: I1203 19:55:19.820315 4813 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/943feb0d-7d31-446a-9100-dfc4ef013d12-kube-api-access\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" Dec 03 19:55:19.837856 master-0 kubenswrapper[4813]: I1203 19:55:19.837827 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbsx8\" (UniqueName: \"kubernetes.io/projected/daa8efc0-4514-4a14-80f5-ab9eca53a127-kube-api-access-rbsx8\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" Dec 03 19:55:19.863455 master-0 kubenswrapper[4813]: I1203 19:55:19.863425 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtwbs\" (UniqueName: \"kubernetes.io/projected/b84835e3-e8bc-4aa4-a8f3-f9be702a358a-kube-api-access-vtwbs\") pod \"csi-snapshot-controller-operator-7b795784b8-4gppw\" (UID: \"b84835e3-e8bc-4aa4-a8f3-f9be702a358a\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw" Dec 03 19:55:19.878739 master-0 kubenswrapper[4813]: I1203 19:55:19.878707 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ghnf\" (UniqueName: \"kubernetes.io/projected/a19b8f9e-6299-43bf-9aa5-22071b855773-kube-api-access-6ghnf\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" Dec 03 19:55:19.887971 master-0 kubenswrapper[4813]: I1203 19:55:19.887922 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4v7k\" (UniqueName: \"kubernetes.io/projected/371917da-b783-4acc-81af-1cfc903269f4-kube-api-access-w4v7k\") pod 
\"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb" Dec 03 19:55:19.888445 master-0 kubenswrapper[4813]: I1203 19:55:19.888390 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/371917da-b783-4acc-81af-1cfc903269f4-iptables-alerter-script\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb" Dec 03 19:55:19.888550 master-0 kubenswrapper[4813]: I1203 19:55:19.888518 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/371917da-b783-4acc-81af-1cfc903269f4-host-slash\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb" Dec 03 19:55:19.888624 master-0 kubenswrapper[4813]: I1203 19:55:19.888573 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/371917da-b783-4acc-81af-1cfc903269f4-host-slash\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb" Dec 03 19:55:19.889126 master-0 kubenswrapper[4813]: I1203 19:55:19.889087 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/371917da-b783-4acc-81af-1cfc903269f4-iptables-alerter-script\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb" Dec 03 19:55:19.901560 master-0 kubenswrapper[4813]: I1203 19:55:19.901507 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdhcd\" (UniqueName: 
\"kubernetes.io/projected/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-kube-api-access-qdhcd\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 19:55:19.918336 master-0 kubenswrapper[4813]: I1203 19:55:19.918279 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2qch\" (UniqueName: \"kubernetes.io/projected/b673cb04-f6f0-4113-bdcd-d6685b942c9f-kube-api-access-m2qch\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:19.938462 master-0 kubenswrapper[4813]: I1203 19:55:19.938392 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" Dec 03 19:55:19.978911 master-0 kubenswrapper[4813]: I1203 19:55:19.978829 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-457ln\" (UniqueName: \"kubernetes.io/projected/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-kube-api-access-457ln\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" Dec 03 19:55:19.993179 master-0 kubenswrapper[4813]: I1203 19:55:19.993099 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" Dec 03 19:55:20.002427 master-0 kubenswrapper[4813]: I1203 19:55:20.002364 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" Dec 03 19:55:20.011822 master-0 kubenswrapper[4813]: I1203 19:55:20.011409 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfqnq\" (UniqueName: \"kubernetes.io/projected/11e2c94f-f9e9-415b-a550-3006a4632ba4-kube-api-access-pfqnq\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" Dec 03 19:55:20.020169 master-0 kubenswrapper[4813]: I1203 19:55:20.020106 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" Dec 03 19:55:20.041189 master-0 kubenswrapper[4813]: I1203 19:55:20.041136 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-kube-api-access\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" Dec 03 19:55:20.045387 master-0 kubenswrapper[4813]: I1203 19:55:20.045329 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 19:55:20.060709 master-0 kubenswrapper[4813]: I1203 19:55:20.060629 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 19:55:20.061893 master-0 kubenswrapper[4813]: I1203 19:55:20.061749 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sqtm\" (UniqueName: \"kubernetes.io/projected/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-kube-api-access-6sqtm\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" Dec 03 19:55:20.069564 master-0 kubenswrapper[4813]: I1203 19:55:20.069003 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" Dec 03 19:55:20.076904 master-0 kubenswrapper[4813]: I1203 19:55:20.076860 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" Dec 03 19:55:20.085347 master-0 kubenswrapper[4813]: I1203 19:55:20.085282 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk4z4\" (UniqueName: \"kubernetes.io/projected/f9f99422-7991-40ef-92a1-de2e603e47b9-kube-api-access-pk4z4\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" Dec 03 19:55:20.091768 master-0 kubenswrapper[4813]: I1203 19:55:20.091695 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw" Dec 03 19:55:20.102826 master-0 kubenswrapper[4813]: I1203 19:55:20.102041 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-bound-sa-token\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 19:55:20.129272 master-0 kubenswrapper[4813]: I1203 19:55:20.127415 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qrgh\" (UniqueName: \"kubernetes.io/projected/128ed384-7ab6-41b6-bf45-c8fda917d52f-kube-api-access-7qrgh\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" Dec 03 19:55:20.143960 master-0 kubenswrapper[4813]: I1203 19:55:20.139105 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 19:55:20.161365 master-0 kubenswrapper[4813]: I1203 19:55:20.160980 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74dvx\" (UniqueName: \"kubernetes.io/projected/b4316c8d-a1d3-4e51-83cc-d0eecb809924-kube-api-access-74dvx\") pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" Dec 03 19:55:20.169516 master-0 kubenswrapper[4813]: I1203 19:55:20.169469 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr8x9\" (UniqueName: \"kubernetes.io/projected/5decce88-c71e-411c-87b5-a37dd0f77e7b-kube-api-access-mr8x9\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 19:55:20.180912 master-0 kubenswrapper[4813]: I1203 19:55:20.180871 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4v7k\" (UniqueName: \"kubernetes.io/projected/371917da-b783-4acc-81af-1cfc903269f4-kube-api-access-w4v7k\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb" Dec 03 19:55:20.192843 master-0 kubenswrapper[4813]: I1203 19:55:20.192771 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 19:55:20.192962 master-0 kubenswrapper[4813]: E1203 19:55:20.192938 4813 
secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 03 19:55:20.192998 master-0 kubenswrapper[4813]: E1203 19:55:20.192989 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert podName:0d4e4f88-7106-4a46-8b63-053345922fb0 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:21.192973937 +0000 UTC m=+165.621772386 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-pqz7q" (UID: "0d4e4f88-7106-4a46-8b63-053345922fb0") : secret "package-server-manager-serving-cert" not found Dec 03 19:55:20.242265 master-0 kubenswrapper[4813]: I1203 19:55:20.235425 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" Dec 03 19:55:20.290410 master-0 kubenswrapper[4813]: I1203 19:55:20.290027 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" Dec 03 19:55:20.294335 master-0 kubenswrapper[4813]: I1203 19:55:20.294199 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" Dec 03 19:55:20.294335 master-0 kubenswrapper[4813]: I1203 19:55:20.294265 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 19:55:20.294335 master-0 kubenswrapper[4813]: I1203 19:55:20.294328 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 19:55:20.294475 master-0 kubenswrapper[4813]: I1203 19:55:20.294410 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 19:55:20.294475 master-0 kubenswrapper[4813]: I1203 19:55:20.294438 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 19:55:20.294475 master-0 kubenswrapper[4813]: I1203 19:55:20.294465 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:20.294558 master-0 kubenswrapper[4813]: I1203 19:55:20.294487 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" Dec 03 19:55:20.294558 master-0 kubenswrapper[4813]: I1203 19:55:20.294537 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" Dec 03 19:55:20.294611 master-0 kubenswrapper[4813]: E1203 19:55:20.294544 4813 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Dec 03 19:55:20.294965 master-0 kubenswrapper[4813]: E1203 19:55:20.294924 4813 secret.go:189] Couldn't get secret 
openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 03 19:55:20.295025 master-0 kubenswrapper[4813]: E1203 19:55:20.294994 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert podName:a19b8f9e-6299-43bf-9aa5-22071b855773 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:21.294938946 +0000 UTC m=+165.723737395 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert") pod "olm-operator-76bd5d69c7-wg7fw" (UID: "a19b8f9e-6299-43bf-9aa5-22071b855773") : secret "olm-operator-serving-cert" not found Dec 03 19:55:20.295268 master-0 kubenswrapper[4813]: E1203 19:55:20.295103 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics podName:b673cb04-f6f0-4113-bdcd-d6685b942c9f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:21.295091251 +0000 UTC m=+165.723889700 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-xqvv6" (UID: "b673cb04-f6f0-4113-bdcd-d6685b942c9f") : secret "marketplace-operator-metrics" not found Dec 03 19:55:20.295268 master-0 kubenswrapper[4813]: E1203 19:55:20.295025 4813 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 03 19:55:20.295268 master-0 kubenswrapper[4813]: I1203 19:55:20.294561 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 19:55:20.295268 master-0 kubenswrapper[4813]: E1203 19:55:20.295176 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls podName:7ed25861-1328-45e7-922e-37588a0b019c nodeName:}" failed. No retries permitted until 2025-12-03 19:55:21.295169203 +0000 UTC m=+165.723967652 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bbd9b9dff-vqzdb" (UID: "7ed25861-1328-45e7-922e-37588a0b019c") : secret "node-tuning-operator-tls" not found Dec 03 19:55:20.295268 master-0 kubenswrapper[4813]: E1203 19:55:20.294707 4813 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Dec 03 19:55:20.295268 master-0 kubenswrapper[4813]: E1203 19:55:20.295251 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert podName:d5f33153-bff1-403f-ae17-b7e90500365d nodeName:}" failed. No retries permitted until 2025-12-03 19:55:21.295243675 +0000 UTC m=+165.724042264 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert") pod "catalog-operator-7cf5cf757f-25z8n" (UID: "d5f33153-bff1-403f-ae17-b7e90500365d") : secret "catalog-operator-serving-cert" not found Dec 03 19:55:20.295268 master-0 kubenswrapper[4813]: E1203 19:55:20.294745 4813 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 03 19:55:20.295472 master-0 kubenswrapper[4813]: E1203 19:55:20.295282 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls podName:3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf nodeName:}" failed. No retries permitted until 2025-12-03 19:55:21.295276625 +0000 UTC m=+165.724075224 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls") pod "ingress-operator-85dbd94574-l7bzj" (UID: "3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf") : secret "metrics-tls" not found Dec 03 19:55:20.295472 master-0 kubenswrapper[4813]: E1203 19:55:20.294799 4813 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 03 19:55:20.295535 master-0 kubenswrapper[4813]: E1203 19:55:20.295484 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert podName:7ed25861-1328-45e7-922e-37588a0b019c nodeName:}" failed. No retries permitted until 2025-12-03 19:55:21.29547305 +0000 UTC m=+165.724271499 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert") pod "cluster-node-tuning-operator-bbd9b9dff-vqzdb" (UID: "7ed25861-1328-45e7-922e-37588a0b019c") : secret "performance-addon-operator-webhook-cert" not found Dec 03 19:55:20.295535 master-0 kubenswrapper[4813]: E1203 19:55:20.294840 4813 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 03 19:55:20.295535 master-0 kubenswrapper[4813]: E1203 19:55:20.294878 4813 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 03 19:55:20.295618 master-0 kubenswrapper[4813]: E1203 19:55:20.295542 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs podName:b4316c8d-a1d3-4e51-83cc-d0eecb809924 nodeName:}" failed. 
No retries permitted until 2025-12-03 19:55:21.295522311 +0000 UTC m=+165.724320860 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs") pod "multus-admission-controller-78ddcf56f9-nqn2j" (UID: "b4316c8d-a1d3-4e51-83cc-d0eecb809924") : secret "multus-admission-controller-secret" not found Dec 03 19:55:20.295618 master-0 kubenswrapper[4813]: E1203 19:55:20.295580 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls podName:128ed384-7ab6-41b6-bf45-c8fda917d52f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:21.295560832 +0000 UTC m=+165.724359281 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls") pod "dns-operator-6b7bcd6566-4wcq2" (UID: "128ed384-7ab6-41b6-bf45-c8fda917d52f") : secret "metrics-tls" not found Dec 03 19:55:20.295618 master-0 kubenswrapper[4813]: E1203 19:55:20.294664 4813 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 03 19:55:20.295618 master-0 kubenswrapper[4813]: E1203 19:55:20.295608 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls podName:5decce88-c71e-411c-87b5-a37dd0f77e7b nodeName:}" failed. No retries permitted until 2025-12-03 19:55:21.295602343 +0000 UTC m=+165.724400782 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-59j4p" (UID: "5decce88-c71e-411c-87b5-a37dd0f77e7b") : secret "image-registry-operator-tls" not found Dec 03 19:55:20.336592 master-0 kubenswrapper[4813]: I1203 19:55:20.333502 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn"] Dec 03 19:55:20.381971 master-0 kubenswrapper[4813]: I1203 19:55:20.381904 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh"] Dec 03 19:55:20.396050 master-0 kubenswrapper[4813]: I1203 19:55:20.396012 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx"] Dec 03 19:55:20.400617 master-0 kubenswrapper[4813]: W1203 19:55:20.400542 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b3ee9a2_0f17_4a04_9191_b60684ef6c29.slice/crio-49efa7facfce8d50bf6399ae2e6f96a9a16dc5f311b520ce196c50d981643fd1 WatchSource:0}: Error finding container 49efa7facfce8d50bf6399ae2e6f96a9a16dc5f311b520ce196c50d981643fd1: Status 404 returned error can't find the container with id 49efa7facfce8d50bf6399ae2e6f96a9a16dc5f311b520ce196c50d981643fd1 Dec 03 19:55:20.427300 master-0 kubenswrapper[4813]: I1203 19:55:20.427251 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 03 19:55:20.461595 master-0 kubenswrapper[4813]: I1203 19:55:20.461554 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw"] Dec 03 19:55:20.466664 master-0 
kubenswrapper[4813]: I1203 19:55:20.464727 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv"] Dec 03 19:55:20.466664 master-0 kubenswrapper[4813]: I1203 19:55:20.466341 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-72rrb" Dec 03 19:55:20.472654 master-0 kubenswrapper[4813]: W1203 19:55:20.472622 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb84835e3_e8bc_4aa4_a8f3_f9be702a358a.slice/crio-ef04edbf93893169f2ce0656a624fe737e2b430675591752e41e98b545e6bf40 WatchSource:0}: Error finding container ef04edbf93893169f2ce0656a624fe737e2b430675591752e41e98b545e6bf40: Status 404 returned error can't find the container with id ef04edbf93893169f2ce0656a624fe737e2b430675591752e41e98b545e6bf40 Dec 03 19:55:20.476238 master-0 kubenswrapper[4813]: W1203 19:55:20.476218 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ac1ae27_c34b_4bab_9f60_b2e2f9ad18b9.slice/crio-f6cfc0641f7e192cbb940115d2ba3add0762b14146ea756523e733a04332e0a9 WatchSource:0}: Error finding container f6cfc0641f7e192cbb940115d2ba3add0762b14146ea756523e733a04332e0a9: Status 404 returned error can't find the container with id f6cfc0641f7e192cbb940115d2ba3add0762b14146ea756523e733a04332e0a9 Dec 03 19:55:20.490918 master-0 kubenswrapper[4813]: I1203 19:55:20.490877 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5"] Dec 03 19:55:20.503351 master-0 kubenswrapper[4813]: I1203 19:55:20.503308 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl"] Dec 03 19:55:20.515042 master-0 kubenswrapper[4813]: W1203 
19:55:20.515004 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9f99422_7991_40ef_92a1_de2e603e47b9.slice/crio-38072447ae412858938614108f0275e0c66bb65d93f888cc2667f73663ae0790 WatchSource:0}: Error finding container 38072447ae412858938614108f0275e0c66bb65d93f888cc2667f73663ae0790: Status 404 returned error can't find the container with id 38072447ae412858938614108f0275e0c66bb65d93f888cc2667f73663ae0790 Dec 03 19:55:20.567216 master-0 kubenswrapper[4813]: I1203 19:55:20.566119 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 03 19:55:20.573354 master-0 kubenswrapper[4813]: E1203 19:55:20.573300 4813 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 03 19:55:20.573410 master-0 kubenswrapper[4813]: E1203 19:55:20.573401 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls podName:ba68608f-6b36-455e-b80b-d19237df9312 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:21.073380549 +0000 UTC m=+165.502178998 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-dhgcv" (UID: "ba68608f-6b36-455e-b80b-d19237df9312") : secret "cluster-monitoring-operator-tls" not found Dec 03 19:55:20.579588 master-0 kubenswrapper[4813]: I1203 19:55:20.579547 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt"] Dec 03 19:55:20.589430 master-0 kubenswrapper[4813]: I1203 19:55:20.589392 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"] Dec 03 19:55:20.590379 master-0 kubenswrapper[4813]: W1203 19:55:20.590275 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd28fbd98_2f67_42f5_9e06_b2e27a4b2f4f.slice/crio-e6304ea619f0b996e6dede6cd4e07910aa977eac4013d0444808ca8298842f22 WatchSource:0}: Error finding container e6304ea619f0b996e6dede6cd4e07910aa977eac4013d0444808ca8298842f22: Status 404 returned error can't find the container with id e6304ea619f0b996e6dede6cd4e07910aa977eac4013d0444808ca8298842f22 Dec 03 19:55:20.596503 master-0 kubenswrapper[4813]: I1203 19:55:20.596464 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf"] Dec 03 19:55:20.597136 master-0 kubenswrapper[4813]: I1203 19:55:20.597087 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj"] Dec 03 19:55:20.599470 master-0 kubenswrapper[4813]: W1203 19:55:20.599423 4813 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda185ee17_4b4b_4d20_a8ed_56a2a01f1807.slice/crio-a336a885dee021602a811c06c05965d3ceafbc2a4e4dc7061efbb563491832b7 WatchSource:0}: Error finding container a336a885dee021602a811c06c05965d3ceafbc2a4e4dc7061efbb563491832b7: Status 404 returned error can't find the container with id a336a885dee021602a811c06c05965d3ceafbc2a4e4dc7061efbb563491832b7 Dec 03 19:55:20.608832 master-0 kubenswrapper[4813]: W1203 19:55:20.608751 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78a864f2_934f_4197_9753_24c9bc7f1fca.slice/crio-b4885c85229f1632ce115036d60c7a6767b9efe2b85e96fadba3614a99fdc575 WatchSource:0}: Error finding container b4885c85229f1632ce115036d60c7a6767b9efe2b85e96fadba3614a99fdc575: Status 404 returned error can't find the container with id b4885c85229f1632ce115036d60c7a6767b9efe2b85e96fadba3614a99fdc575 Dec 03 19:55:20.635796 master-0 kubenswrapper[4813]: I1203 19:55:20.635741 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 03 19:55:20.643804 master-0 kubenswrapper[4813]: I1203 19:55:20.643760 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 03 19:55:20.645150 master-0 kubenswrapper[4813]: I1203 19:55:20.645122 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ba68608f-6b36-455e-b80b-d19237df9312-telemetry-config\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" Dec 03 19:55:20.655248 master-0 kubenswrapper[4813]: I1203 19:55:20.655206 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-855t4\" (UniqueName: 
\"kubernetes.io/projected/ba68608f-6b36-455e-b80b-d19237df9312-kube-api-access-855t4\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" Dec 03 19:55:20.663332 master-0 kubenswrapper[4813]: I1203 19:55:20.663294 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" event={"ID":"a185ee17-4b4b-4d20-a8ed-56a2a01f1807","Type":"ContainerStarted","Data":"a336a885dee021602a811c06c05965d3ceafbc2a4e4dc7061efbb563491832b7"} Dec 03 19:55:20.664176 master-0 kubenswrapper[4813]: I1203 19:55:20.664145 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" event={"ID":"f9f99422-7991-40ef-92a1-de2e603e47b9","Type":"ContainerStarted","Data":"38072447ae412858938614108f0275e0c66bb65d93f888cc2667f73663ae0790"} Dec 03 19:55:20.665155 master-0 kubenswrapper[4813]: I1203 19:55:20.665130 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" event={"ID":"11e2c94f-f9e9-415b-a550-3006a4632ba4","Type":"ContainerStarted","Data":"08c01ca5f1fe5f2ef9cd1ac17b729f8e737e95206dcb86f9ce9c09225b746a55"} Dec 03 19:55:20.666183 master-0 kubenswrapper[4813]: I1203 19:55:20.666151 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" event={"ID":"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3","Type":"ContainerStarted","Data":"ec90b46e5817f62e5cb3d92e8419aeaaa1a2c0a9eebd84f2c7545dcfdabcf365"} Dec 03 19:55:20.667213 master-0 kubenswrapper[4813]: I1203 19:55:20.667157 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw" 
event={"ID":"b84835e3-e8bc-4aa4-a8f3-f9be702a358a","Type":"ContainerStarted","Data":"ef04edbf93893169f2ce0656a624fe737e2b430675591752e41e98b545e6bf40"}
Dec 03 19:55:20.668024 master-0 kubenswrapper[4813]: I1203 19:55:20.667975 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" event={"ID":"78a864f2-934f-4197-9753-24c9bc7f1fca","Type":"ContainerStarted","Data":"b4885c85229f1632ce115036d60c7a6767b9efe2b85e96fadba3614a99fdc575"}
Dec 03 19:55:20.668790 master-0 kubenswrapper[4813]: I1203 19:55:20.668740 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" event={"ID":"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9","Type":"ContainerStarted","Data":"f6cfc0641f7e192cbb940115d2ba3add0762b14146ea756523e733a04332e0a9"}
Dec 03 19:55:20.669948 master-0 kubenswrapper[4813]: I1203 19:55:20.669913 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" event={"ID":"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f","Type":"ContainerStarted","Data":"e6304ea619f0b996e6dede6cd4e07910aa977eac4013d0444808ca8298842f22"}
Dec 03 19:55:20.671063 master-0 kubenswrapper[4813]: I1203 19:55:20.671035 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" event={"ID":"943feb0d-7d31-446a-9100-dfc4ef013d12","Type":"ContainerStarted","Data":"23b4f3f34e8595251e0fdeffba36a81024e5f343e733b49e23a5e472d12bfa81"}
Dec 03 19:55:20.671063 master-0 kubenswrapper[4813]: I1203 19:55:20.671062 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" event={"ID":"943feb0d-7d31-446a-9100-dfc4ef013d12","Type":"ContainerStarted","Data":"0c22de28b514bd9de5323a780b66baaf0574a8898405da26c3c85130d1ec1ce9"}
Dec 03 19:55:20.671926 master-0 kubenswrapper[4813]: I1203 19:55:20.671877 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-72rrb" event={"ID":"371917da-b783-4acc-81af-1cfc903269f4","Type":"ContainerStarted","Data":"8b5478f55322c86d9620262432fda124f2df1ae79e09d51d64ffbf6929820091"}
Dec 03 19:55:20.673291 master-0 kubenswrapper[4813]: I1203 19:55:20.673126 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" event={"ID":"5b3ee9a2-0f17-4a04-9191-b60684ef6c29","Type":"ContainerStarted","Data":"49efa7facfce8d50bf6399ae2e6f96a9a16dc5f311b520ce196c50d981643fd1"}
Dec 03 19:55:20.674168 master-0 kubenswrapper[4813]: I1203 19:55:20.674136 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" event={"ID":"daa8efc0-4514-4a14-80f5-ab9eca53a127","Type":"ContainerStarted","Data":"c8fa62db9ae1d5afc07c786415f97448d1baeaca29acf6f92b49c7da920421a7"}
Dec 03 19:55:20.685674 master-0 kubenswrapper[4813]: I1203 19:55:20.685598 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" podStartSLOduration=121.685562048 podStartE2EDuration="2m1.685562048s" podCreationTimestamp="2025-12-03 19:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:55:20.683802683 +0000 UTC m=+165.112601132" watchObservedRunningTime="2025-12-03 19:55:20.685562048 +0000 UTC m=+165.114360497"
Dec 03 19:55:20.770675 master-0 kubenswrapper[4813]: I1203 19:55:20.770616 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Dec 03 19:55:20.775700 master-0 kubenswrapper[4813]: I1203 19:55:20.775658 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-config\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5"
Dec 03 19:55:20.861894 master-0 kubenswrapper[4813]: I1203 19:55:20.861425 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5"
Dec 03 19:55:21.092279 master-0 kubenswrapper[4813]: I1203 19:55:21.092235 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5"]
Dec 03 19:55:21.100809 master-0 kubenswrapper[4813]: W1203 19:55:21.100761 4813 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01d51d9a_9beb_4357_9dc2_aeac210cd0c4.slice/crio-db6143edbd1b68cfe8bbe553ee3ca87d799ea0e63aff48d4d038dfa43496204a WatchSource:0}: Error finding container db6143edbd1b68cfe8bbe553ee3ca87d799ea0e63aff48d4d038dfa43496204a: Status 404 returned error can't find the container with id db6143edbd1b68cfe8bbe553ee3ca87d799ea0e63aff48d4d038dfa43496204a
Dec 03 19:55:21.104399 master-0 kubenswrapper[4813]: I1203 19:55:21.104246 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"
Dec 03 19:55:21.104512 master-0 kubenswrapper[4813]: E1203 19:55:21.104447 4813 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Dec 03 19:55:21.104745 master-0 kubenswrapper[4813]: E1203 19:55:21.104535 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls podName:ba68608f-6b36-455e-b80b-d19237df9312 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:22.10451277 +0000 UTC m=+166.533311229 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-dhgcv" (UID: "ba68608f-6b36-455e-b80b-d19237df9312") : secret "cluster-monitoring-operator-tls" not found
Dec 03 19:55:21.205175 master-0 kubenswrapper[4813]: I1203 19:55:21.205006 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q"
Dec 03 19:55:21.205343 master-0 kubenswrapper[4813]: E1203 19:55:21.205265 4813 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Dec 03 19:55:21.205343 master-0 kubenswrapper[4813]: E1203 19:55:21.205324 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert podName:0d4e4f88-7106-4a46-8b63-053345922fb0 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:23.20530609 +0000 UTC m=+167.634104539 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-pqz7q" (UID: "0d4e4f88-7106-4a46-8b63-053345922fb0") : secret "package-server-manager-serving-cert" not found
Dec 03 19:55:21.306357 master-0 kubenswrapper[4813]: I1203 19:55:21.306303 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 19:55:21.306631 master-0 kubenswrapper[4813]: I1203 19:55:21.306372 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n"
Dec 03 19:55:21.306631 master-0 kubenswrapper[4813]: I1203 19:55:21.306411 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj"
Dec 03 19:55:21.306631 master-0 kubenswrapper[4813]: I1203 19:55:21.306455 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:21.306631 master-0 kubenswrapper[4813]: I1203 19:55:21.306482 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:21.306631 master-0 kubenswrapper[4813]: I1203 19:55:21.306510 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6"
Dec 03 19:55:21.306631 master-0 kubenswrapper[4813]: I1203 19:55:21.306531 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j"
Dec 03 19:55:21.306631 master-0 kubenswrapper[4813]: I1203 19:55:21.306570 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2"
Dec 03 19:55:21.306631 master-0 kubenswrapper[4813]: I1203 19:55:21.306591 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p"
Dec 03 19:55:21.306864 master-0 kubenswrapper[4813]: E1203 19:55:21.306726 4813 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Dec 03 19:55:21.306864 master-0 kubenswrapper[4813]: E1203 19:55:21.306794 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls podName:5decce88-c71e-411c-87b5-a37dd0f77e7b nodeName:}" failed. No retries permitted until 2025-12-03 19:55:23.306762747 +0000 UTC m=+167.735561196 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-59j4p" (UID: "5decce88-c71e-411c-87b5-a37dd0f77e7b") : secret "image-registry-operator-tls" not found
Dec 03 19:55:21.307372 master-0 kubenswrapper[4813]: E1203 19:55:21.306989 4813 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Dec 03 19:55:21.307372 master-0 kubenswrapper[4813]: E1203 19:55:21.307025 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls podName:7ed25861-1328-45e7-922e-37588a0b019c nodeName:}" failed. No retries permitted until 2025-12-03 19:55:23.307015523 +0000 UTC m=+167.735813972 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bbd9b9dff-vqzdb" (UID: "7ed25861-1328-45e7-922e-37588a0b019c") : secret "node-tuning-operator-tls" not found
Dec 03 19:55:21.307372 master-0 kubenswrapper[4813]: E1203 19:55:21.307062 4813 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Dec 03 19:55:21.307372 master-0 kubenswrapper[4813]: E1203 19:55:21.307081 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls podName:3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf nodeName:}" failed. No retries permitted until 2025-12-03 19:55:23.307074764 +0000 UTC m=+167.735873213 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls") pod "ingress-operator-85dbd94574-l7bzj" (UID: "3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf") : secret "metrics-tls" not found
Dec 03 19:55:21.307372 master-0 kubenswrapper[4813]: E1203 19:55:21.307109 4813 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Dec 03 19:55:21.307372 master-0 kubenswrapper[4813]: E1203 19:55:21.307134 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert podName:7ed25861-1328-45e7-922e-37588a0b019c nodeName:}" failed. No retries permitted until 2025-12-03 19:55:23.307121306 +0000 UTC m=+167.735919745 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert") pod "cluster-node-tuning-operator-bbd9b9dff-vqzdb" (UID: "7ed25861-1328-45e7-922e-37588a0b019c") : secret "performance-addon-operator-webhook-cert" not found
Dec 03 19:55:21.307372 master-0 kubenswrapper[4813]: E1203 19:55:21.307139 4813 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Dec 03 19:55:21.307372 master-0 kubenswrapper[4813]: E1203 19:55:21.307165 4813 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Dec 03 19:55:21.307372 master-0 kubenswrapper[4813]: E1203 19:55:21.307189 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert podName:a19b8f9e-6299-43bf-9aa5-22071b855773 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:23.307184447 +0000 UTC m=+167.735982896 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert") pod "olm-operator-76bd5d69c7-wg7fw" (UID: "a19b8f9e-6299-43bf-9aa5-22071b855773") : secret "olm-operator-serving-cert" not found
Dec 03 19:55:21.307372 master-0 kubenswrapper[4813]: E1203 19:55:21.307220 4813 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Dec 03 19:55:21.307372 master-0 kubenswrapper[4813]: E1203 19:55:21.307242 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert podName:d5f33153-bff1-403f-ae17-b7e90500365d nodeName:}" failed. No retries permitted until 2025-12-03 19:55:23.307217638 +0000 UTC m=+167.736016157 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert") pod "catalog-operator-7cf5cf757f-25z8n" (UID: "d5f33153-bff1-403f-ae17-b7e90500365d") : secret "catalog-operator-serving-cert" not found
Dec 03 19:55:21.307372 master-0 kubenswrapper[4813]: E1203 19:55:21.307257 4813 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Dec 03 19:55:21.307372 master-0 kubenswrapper[4813]: E1203 19:55:21.307266 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs podName:b4316c8d-a1d3-4e51-83cc-d0eecb809924 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:23.307256639 +0000 UTC m=+167.736055188 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs") pod "multus-admission-controller-78ddcf56f9-nqn2j" (UID: "b4316c8d-a1d3-4e51-83cc-d0eecb809924") : secret "multus-admission-controller-secret" not found
Dec 03 19:55:21.307372 master-0 kubenswrapper[4813]: E1203 19:55:21.307278 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics podName:b673cb04-f6f0-4113-bdcd-d6685b942c9f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:23.307272879 +0000 UTC m=+167.736071328 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-xqvv6" (UID: "b673cb04-f6f0-4113-bdcd-d6685b942c9f") : secret "marketplace-operator-metrics" not found
Dec 03 19:55:21.307372 master-0 kubenswrapper[4813]: E1203 19:55:21.307332 4813 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Dec 03 19:55:21.307785 master-0 kubenswrapper[4813]: E1203 19:55:21.307359 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls podName:128ed384-7ab6-41b6-bf45-c8fda917d52f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:23.307351922 +0000 UTC m=+167.736150471 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls") pod "dns-operator-6b7bcd6566-4wcq2" (UID: "128ed384-7ab6-41b6-bf45-c8fda917d52f") : secret "metrics-tls" not found
Dec 03 19:55:21.681170 master-0 kubenswrapper[4813]: I1203 19:55:21.681104 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" event={"ID":"01d51d9a-9beb-4357-9dc2-aeac210cd0c4","Type":"ContainerStarted","Data":"db6143edbd1b68cfe8bbe553ee3ca87d799ea0e63aff48d4d038dfa43496204a"}
Dec 03 19:55:22.116827 master-0 kubenswrapper[4813]: I1203 19:55:22.116450 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"
Dec 03 19:55:22.117022 master-0 kubenswrapper[4813]: E1203 19:55:22.116877 4813 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Dec 03 19:55:22.117022 master-0 kubenswrapper[4813]: E1203 19:55:22.116985 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls podName:ba68608f-6b36-455e-b80b-d19237df9312 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:24.116962586 +0000 UTC m=+168.545761105 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-dhgcv" (UID: "ba68608f-6b36-455e-b80b-d19237df9312") : secret "cluster-monitoring-operator-tls" not found
Dec 03 19:55:23.228143 master-0 kubenswrapper[4813]: I1203 19:55:23.228073 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q"
Dec 03 19:55:23.228822 master-0 kubenswrapper[4813]: E1203 19:55:23.228290 4813 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Dec 03 19:55:23.228822 master-0 kubenswrapper[4813]: E1203 19:55:23.228365 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert podName:0d4e4f88-7106-4a46-8b63-053345922fb0 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:27.228347176 +0000 UTC m=+171.657145625 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-pqz7q" (UID: "0d4e4f88-7106-4a46-8b63-053345922fb0") : secret "package-server-manager-serving-cert" not found
Dec 03 19:55:23.329615 master-0 kubenswrapper[4813]: I1203 19:55:23.329534 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:23.329818 master-0 kubenswrapper[4813]: I1203 19:55:23.329655 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:23.329818 master-0 kubenswrapper[4813]: I1203 19:55:23.329717 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6"
Dec 03 19:55:23.329818 master-0 kubenswrapper[4813]: E1203 19:55:23.329751 4813 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Dec 03 19:55:23.329818 master-0 kubenswrapper[4813]: E1203 19:55:23.329799 4813 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Dec 03 19:55:23.330001 master-0 kubenswrapper[4813]: E1203 19:55:23.329847 4813 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Dec 03 19:55:23.330001 master-0 kubenswrapper[4813]: I1203 19:55:23.329764 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j"
Dec 03 19:55:23.330001 master-0 kubenswrapper[4813]: E1203 19:55:23.329857 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert podName:7ed25861-1328-45e7-922e-37588a0b019c nodeName:}" failed. No retries permitted until 2025-12-03 19:55:27.329835033 +0000 UTC m=+171.758633552 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert") pod "cluster-node-tuning-operator-bbd9b9dff-vqzdb" (UID: "7ed25861-1328-45e7-922e-37588a0b019c") : secret "performance-addon-operator-webhook-cert" not found
Dec 03 19:55:23.330001 master-0 kubenswrapper[4813]: E1203 19:55:23.329905 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls podName:7ed25861-1328-45e7-922e-37588a0b019c nodeName:}" failed. No retries permitted until 2025-12-03 19:55:27.329887824 +0000 UTC m=+171.758686273 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bbd9b9dff-vqzdb" (UID: "7ed25861-1328-45e7-922e-37588a0b019c") : secret "node-tuning-operator-tls" not found
Dec 03 19:55:23.330001 master-0 kubenswrapper[4813]: E1203 19:55:23.329917 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics podName:b673cb04-f6f0-4113-bdcd-d6685b942c9f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:27.329912085 +0000 UTC m=+171.758710524 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-xqvv6" (UID: "b673cb04-f6f0-4113-bdcd-d6685b942c9f") : secret "marketplace-operator-metrics" not found
Dec 03 19:55:23.330001 master-0 kubenswrapper[4813]: I1203 19:55:23.329947 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2"
Dec 03 19:55:23.330001 master-0 kubenswrapper[4813]: E1203 19:55:23.329953 4813 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Dec 03 19:55:23.330001 master-0 kubenswrapper[4813]: I1203 19:55:23.329969 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p"
Dec 03 19:55:23.330001 master-0 kubenswrapper[4813]: I1203 19:55:23.330003 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 19:55:23.330336 master-0 kubenswrapper[4813]: E1203 19:55:23.330046 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs podName:b4316c8d-a1d3-4e51-83cc-d0eecb809924 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:27.330016517 +0000 UTC m=+171.758815016 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs") pod "multus-admission-controller-78ddcf56f9-nqn2j" (UID: "b4316c8d-a1d3-4e51-83cc-d0eecb809924") : secret "multus-admission-controller-secret" not found
Dec 03 19:55:23.330336 master-0 kubenswrapper[4813]: I1203 19:55:23.330123 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n"
Dec 03 19:55:23.330336 master-0 kubenswrapper[4813]: E1203 19:55:23.330160 4813 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Dec 03 19:55:23.330336 master-0 kubenswrapper[4813]: I1203 19:55:23.330195 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj"
Dec 03 19:55:23.330336 master-0 kubenswrapper[4813]: E1203 19:55:23.330204 4813 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Dec 03 19:55:23.330336 master-0 kubenswrapper[4813]: E1203 19:55:23.330252 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls podName:128ed384-7ab6-41b6-bf45-c8fda917d52f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:27.330229413 +0000 UTC m=+171.759027912 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls") pod "dns-operator-6b7bcd6566-4wcq2" (UID: "128ed384-7ab6-41b6-bf45-c8fda917d52f") : secret "metrics-tls" not found
Dec 03 19:55:23.330336 master-0 kubenswrapper[4813]: E1203 19:55:23.330294 4813 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Dec 03 19:55:23.330336 master-0 kubenswrapper[4813]: E1203 19:55:23.330324 4813 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Dec 03 19:55:23.330336 master-0 kubenswrapper[4813]: E1203 19:55:23.330052 4813 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Dec 03 19:55:23.330725 master-0 kubenswrapper[4813]: E1203 19:55:23.330332 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls podName:5decce88-c71e-411c-87b5-a37dd0f77e7b nodeName:}" failed. No retries permitted until 2025-12-03 19:55:27.330293154 +0000 UTC m=+171.759091643 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-59j4p" (UID: "5decce88-c71e-411c-87b5-a37dd0f77e7b") : secret "image-registry-operator-tls" not found
Dec 03 19:55:23.330725 master-0 kubenswrapper[4813]: E1203 19:55:23.330532 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert podName:d5f33153-bff1-403f-ae17-b7e90500365d nodeName:}" failed. No retries permitted until 2025-12-03 19:55:27.330519821 +0000 UTC m=+171.759318350 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert") pod "catalog-operator-7cf5cf757f-25z8n" (UID: "d5f33153-bff1-403f-ae17-b7e90500365d") : secret "catalog-operator-serving-cert" not found
Dec 03 19:55:23.330725 master-0 kubenswrapper[4813]: E1203 19:55:23.330549 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls podName:3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf nodeName:}" failed. No retries permitted until 2025-12-03 19:55:27.330540221 +0000 UTC m=+171.759338780 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls") pod "ingress-operator-85dbd94574-l7bzj" (UID: "3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf") : secret "metrics-tls" not found
Dec 03 19:55:23.330725 master-0 kubenswrapper[4813]: E1203 19:55:23.330566 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert podName:a19b8f9e-6299-43bf-9aa5-22071b855773 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:27.330557692 +0000 UTC m=+171.759356281 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert") pod "olm-operator-76bd5d69c7-wg7fw" (UID: "a19b8f9e-6299-43bf-9aa5-22071b855773") : secret "olm-operator-serving-cert" not found
Dec 03 19:55:24.141087 master-0 kubenswrapper[4813]: I1203 19:55:24.141030 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"
Dec 03 19:55:24.141302 master-0 kubenswrapper[4813]: E1203 19:55:24.141187 4813 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Dec 03 19:55:24.141302 master-0 kubenswrapper[4813]: E1203 19:55:24.141279 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls podName:ba68608f-6b36-455e-b80b-d19237df9312 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:28.141260404 +0000 UTC m=+172.570058853 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-dhgcv" (UID: "ba68608f-6b36-455e-b80b-d19237df9312") : secret "cluster-monitoring-operator-tls" not found
Dec 03 19:55:27.271419 master-0 kubenswrapper[4813]: I1203 19:55:27.270870 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q"
Dec 03 19:55:27.271419 master-0 kubenswrapper[4813]: E1203 19:55:27.271069 4813 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Dec 03 19:55:27.271419 master-0 kubenswrapper[4813]: E1203 19:55:27.271113 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert podName:0d4e4f88-7106-4a46-8b63-053345922fb0 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:35.271100571 +0000 UTC m=+179.699899010 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-pqz7q" (UID: "0d4e4f88-7106-4a46-8b63-053345922fb0") : secret "package-server-manager-serving-cert" not found Dec 03 19:55:27.372718 master-0 kubenswrapper[4813]: I1203 19:55:27.371302 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 19:55:27.372718 master-0 kubenswrapper[4813]: I1203 19:55:27.371358 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 19:55:27.372718 master-0 kubenswrapper[4813]: I1203 19:55:27.371387 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:27.372718 master-0 kubenswrapper[4813]: I1203 19:55:27.371412 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs\") 
pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" Dec 03 19:55:27.372718 master-0 kubenswrapper[4813]: I1203 19:55:27.371454 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" Dec 03 19:55:27.372718 master-0 kubenswrapper[4813]: I1203 19:55:27.371475 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 19:55:27.372718 master-0 kubenswrapper[4813]: I1203 19:55:27.371511 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" Dec 03 19:55:27.372718 master-0 kubenswrapper[4813]: I1203 19:55:27.371544 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 19:55:27.372718 master-0 kubenswrapper[4813]: I1203 19:55:27.371579 4813 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 19:55:27.372718 master-0 kubenswrapper[4813]: E1203 19:55:27.371704 4813 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 03 19:55:27.372718 master-0 kubenswrapper[4813]: E1203 19:55:27.371758 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls podName:3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf nodeName:}" failed. No retries permitted until 2025-12-03 19:55:35.371741018 +0000 UTC m=+179.800539457 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls") pod "ingress-operator-85dbd94574-l7bzj" (UID: "3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf") : secret "metrics-tls" not found Dec 03 19:55:27.372718 master-0 kubenswrapper[4813]: E1203 19:55:27.371958 4813 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 03 19:55:27.372718 master-0 kubenswrapper[4813]: E1203 19:55:27.371988 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert podName:7ed25861-1328-45e7-922e-37588a0b019c nodeName:}" failed. No retries permitted until 2025-12-03 19:55:35.371979703 +0000 UTC m=+179.800778152 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert") pod "cluster-node-tuning-operator-bbd9b9dff-vqzdb" (UID: "7ed25861-1328-45e7-922e-37588a0b019c") : secret "performance-addon-operator-webhook-cert" not found Dec 03 19:55:27.372718 master-0 kubenswrapper[4813]: E1203 19:55:27.372029 4813 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 03 19:55:27.372718 master-0 kubenswrapper[4813]: E1203 19:55:27.372055 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls podName:7ed25861-1328-45e7-922e-37588a0b019c nodeName:}" failed. No retries permitted until 2025-12-03 19:55:35.372047005 +0000 UTC m=+179.800845454 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bbd9b9dff-vqzdb" (UID: "7ed25861-1328-45e7-922e-37588a0b019c") : secret "node-tuning-operator-tls" not found Dec 03 19:55:27.372718 master-0 kubenswrapper[4813]: E1203 19:55:27.372341 4813 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 03 19:55:27.373464 master-0 kubenswrapper[4813]: E1203 19:55:27.372373 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics podName:b673cb04-f6f0-4113-bdcd-d6685b942c9f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:35.372363443 +0000 UTC m=+179.801161892 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-xqvv6" (UID: "b673cb04-f6f0-4113-bdcd-d6685b942c9f") : secret "marketplace-operator-metrics" not found Dec 03 19:55:27.373464 master-0 kubenswrapper[4813]: E1203 19:55:27.372422 4813 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 03 19:55:27.373464 master-0 kubenswrapper[4813]: E1203 19:55:27.372445 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs podName:b4316c8d-a1d3-4e51-83cc-d0eecb809924 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:35.372437435 +0000 UTC m=+179.801235884 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs") pod "multus-admission-controller-78ddcf56f9-nqn2j" (UID: "b4316c8d-a1d3-4e51-83cc-d0eecb809924") : secret "multus-admission-controller-secret" not found Dec 03 19:55:27.373464 master-0 kubenswrapper[4813]: E1203 19:55:27.372482 4813 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 03 19:55:27.373464 master-0 kubenswrapper[4813]: E1203 19:55:27.372505 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls podName:128ed384-7ab6-41b6-bf45-c8fda917d52f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:35.372497816 +0000 UTC m=+179.801296265 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls") pod "dns-operator-6b7bcd6566-4wcq2" (UID: "128ed384-7ab6-41b6-bf45-c8fda917d52f") : secret "metrics-tls" not found Dec 03 19:55:27.373464 master-0 kubenswrapper[4813]: E1203 19:55:27.372545 4813 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 03 19:55:27.373464 master-0 kubenswrapper[4813]: E1203 19:55:27.372565 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls podName:5decce88-c71e-411c-87b5-a37dd0f77e7b nodeName:}" failed. No retries permitted until 2025-12-03 19:55:35.372558408 +0000 UTC m=+179.801356857 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-59j4p" (UID: "5decce88-c71e-411c-87b5-a37dd0f77e7b") : secret "image-registry-operator-tls" not found Dec 03 19:55:27.373464 master-0 kubenswrapper[4813]: E1203 19:55:27.372602 4813 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Dec 03 19:55:27.373464 master-0 kubenswrapper[4813]: E1203 19:55:27.372627 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert podName:a19b8f9e-6299-43bf-9aa5-22071b855773 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:35.372619669 +0000 UTC m=+179.801418118 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert") pod "olm-operator-76bd5d69c7-wg7fw" (UID: "a19b8f9e-6299-43bf-9aa5-22071b855773") : secret "olm-operator-serving-cert" not found Dec 03 19:55:27.373464 master-0 kubenswrapper[4813]: E1203 19:55:27.372665 4813 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Dec 03 19:55:27.373464 master-0 kubenswrapper[4813]: E1203 19:55:27.372686 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert podName:d5f33153-bff1-403f-ae17-b7e90500365d nodeName:}" failed. No retries permitted until 2025-12-03 19:55:35.372678851 +0000 UTC m=+179.801477300 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert") pod "catalog-operator-7cf5cf757f-25z8n" (UID: "d5f33153-bff1-403f-ae17-b7e90500365d") : secret "catalog-operator-serving-cert" not found Dec 03 19:55:27.574199 master-0 kubenswrapper[4813]: I1203 19:55:27.573450 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkhcw\" (UniqueName: \"kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw\") pod \"network-check-target-x6vwd\" (UID: \"830d89af-1266-43ac-b113-990a28595f91\") " pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:55:27.577962 master-0 kubenswrapper[4813]: I1203 19:55:27.577930 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkhcw\" (UniqueName: \"kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw\") pod \"network-check-target-x6vwd\" (UID: \"830d89af-1266-43ac-b113-990a28595f91\") " 
pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:55:27.659344 master-0 kubenswrapper[4813]: I1203 19:55:27.658931 4813 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:55:28.181135 master-0 kubenswrapper[4813]: I1203 19:55:28.180240 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" Dec 03 19:55:28.181135 master-0 kubenswrapper[4813]: E1203 19:55:28.180507 4813 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 03 19:55:28.181135 master-0 kubenswrapper[4813]: E1203 19:55:28.180585 4813 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls podName:ba68608f-6b36-455e-b80b-d19237df9312 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:36.180563061 +0000 UTC m=+180.609361510 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-dhgcv" (UID: "ba68608f-6b36-455e-b80b-d19237df9312") : secret "cluster-monitoring-operator-tls" not found Dec 03 19:55:28.923495 master-0 kubenswrapper[4813]: I1203 19:55:28.922800 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-x6vwd"] Dec 03 19:55:29.657845 master-0 kubenswrapper[4813]: I1203 19:55:29.657275 4813 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:29.708829 master-0 kubenswrapper[4813]: I1203 19:55:29.708539 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" event={"ID":"a185ee17-4b4b-4d20-a8ed-56a2a01f1807","Type":"ContainerStarted","Data":"8ee6a0b56a85c0d14ad54d2283fc55b5a9f7a55c73d41cd24b0430be03f47449"} Dec 03 19:55:29.717884 master-0 kubenswrapper[4813]: I1203 19:55:29.717578 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" event={"ID":"5b3ee9a2-0f17-4a04-9191-b60684ef6c29","Type":"ContainerStarted","Data":"67df0016b48dcce14201ac3044aca405e44a73dd4f2748c38de589d5302c6d89"} Dec 03 19:55:29.723416 master-0 kubenswrapper[4813]: I1203 19:55:29.723099 4813 generic.go:334] "Generic (PLEG): container finished" podID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerID="ebce70450136604f7c52ead6ab27edb4126b2802849c71ec6e71d90ddadab566" exitCode=0 Dec 03 19:55:29.723416 master-0 kubenswrapper[4813]: I1203 19:55:29.723163 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" 
event={"ID":"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9","Type":"ContainerDied","Data":"ebce70450136604f7c52ead6ab27edb4126b2802849c71ec6e71d90ddadab566"} Dec 03 19:55:29.728266 master-0 kubenswrapper[4813]: I1203 19:55:29.728205 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podStartSLOduration=124.660587744 podStartE2EDuration="2m12.728185161s" podCreationTimestamp="2025-12-03 19:53:17 +0000 UTC" firstStartedPulling="2025-12-03 19:55:20.603388601 +0000 UTC m=+165.032187040" lastFinishedPulling="2025-12-03 19:55:28.670986008 +0000 UTC m=+173.099784457" observedRunningTime="2025-12-03 19:55:29.727258057 +0000 UTC m=+174.156056516" watchObservedRunningTime="2025-12-03 19:55:29.728185161 +0000 UTC m=+174.156983610" Dec 03 19:55:29.733833 master-0 kubenswrapper[4813]: I1203 19:55:29.733748 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-x6vwd" event={"ID":"830d89af-1266-43ac-b113-990a28595f91","Type":"ContainerStarted","Data":"808ed291ef274c77392498b5d659ab97ef12ef9411a6d5fc4d34b085854c3c44"} Dec 03 19:55:29.733833 master-0 kubenswrapper[4813]: I1203 19:55:29.733793 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-x6vwd" event={"ID":"830d89af-1266-43ac-b113-990a28595f91","Type":"ContainerStarted","Data":"15fa7ece9624e476a927666dc492b7bd2df94f7942d686ce643ec390d690ecca"} Dec 03 19:55:29.734334 master-0 kubenswrapper[4813]: I1203 19:55:29.734276 4813 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:55:29.746888 master-0 kubenswrapper[4813]: I1203 19:55:29.746835 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" 
event={"ID":"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3","Type":"ContainerStarted","Data":"5368f3d8c609d03f47b3a2379952daea482ac8f810b561b93821ae543a16d61e"} Dec 03 19:55:29.749641 master-0 kubenswrapper[4813]: I1203 19:55:29.749601 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" event={"ID":"daa8efc0-4514-4a14-80f5-ab9eca53a127","Type":"ContainerStarted","Data":"794beba2362386c338599c102e787bfbcb667a8f297d93f341ccc297bdb73087"} Dec 03 19:55:29.756565 master-0 kubenswrapper[4813]: I1203 19:55:29.756024 4813 generic.go:334] "Generic (PLEG): container finished" podID="f9f99422-7991-40ef-92a1-de2e603e47b9" containerID="431c55fff96bdc81a72543ef7c8b4286f0ecf12b7dc9b0a56daf54373c4eef86" exitCode=0 Dec 03 19:55:29.756565 master-0 kubenswrapper[4813]: I1203 19:55:29.756121 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" event={"ID":"f9f99422-7991-40ef-92a1-de2e603e47b9","Type":"ContainerDied","Data":"431c55fff96bdc81a72543ef7c8b4286f0ecf12b7dc9b0a56daf54373c4eef86"} Dec 03 19:55:29.789496 master-0 kubenswrapper[4813]: I1203 19:55:29.786284 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" event={"ID":"11e2c94f-f9e9-415b-a550-3006a4632ba4","Type":"ContainerStarted","Data":"74b33948f209172661a41eab8dd989534e03391e2f9b3dab897af1dbb663716c"} Dec 03 19:55:29.795563 master-0 kubenswrapper[4813]: I1203 19:55:29.794829 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw" event={"ID":"b84835e3-e8bc-4aa4-a8f3-f9be702a358a","Type":"ContainerStarted","Data":"193ee1ad3e7ee183f1ea38494d7735760027689afd79629a8d160747a2494f67"} Dec 03 19:55:29.809273 master-0 kubenswrapper[4813]: I1203 19:55:29.807052 
4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" event={"ID":"78a864f2-934f-4197-9753-24c9bc7f1fca","Type":"ContainerStarted","Data":"0a8f6a401bc81d9be5a9cf7156ef428d64e4a5a0d08e4c992efc6ddc65d0a9c3"} Dec 03 19:55:29.811754 master-0 kubenswrapper[4813]: I1203 19:55:29.811707 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" event={"ID":"01d51d9a-9beb-4357-9dc2-aeac210cd0c4","Type":"ContainerStarted","Data":"e73e12ce13ca81b680321fa012f494204d85d5e6386ba40c3313c0c4756967da"} Dec 03 19:55:29.822315 master-0 kubenswrapper[4813]: I1203 19:55:29.822164 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" podStartSLOduration=122.573999355 podStartE2EDuration="2m10.822143538s" podCreationTimestamp="2025-12-03 19:53:19 +0000 UTC" firstStartedPulling="2025-12-03 19:55:20.402538909 +0000 UTC m=+164.831337358" lastFinishedPulling="2025-12-03 19:55:28.650683092 +0000 UTC m=+173.079481541" observedRunningTime="2025-12-03 19:55:29.791600622 +0000 UTC m=+174.220399071" watchObservedRunningTime="2025-12-03 19:55:29.822143538 +0000 UTC m=+174.250941987" Dec 03 19:55:29.822755 master-0 kubenswrapper[4813]: I1203 19:55:29.822707 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" event={"ID":"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f","Type":"ContainerStarted","Data":"e25a90c6c614930a0aba8ebec6ee17a1bf73a834467d4ec954b7d5ad039662fb"} Dec 03 19:55:29.823112 master-0 kubenswrapper[4813]: I1203 19:55:29.823068 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7"] Dec 03 19:55:29.823657 master-0 kubenswrapper[4813]: I1203 19:55:29.823624 4813 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" Dec 03 19:55:29.852521 master-0 kubenswrapper[4813]: I1203 19:55:29.848622 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" podStartSLOduration=122.797678607 podStartE2EDuration="2m10.84860014s" podCreationTimestamp="2025-12-03 19:53:19 +0000 UTC" firstStartedPulling="2025-12-03 19:55:20.607739472 +0000 UTC m=+165.036537921" lastFinishedPulling="2025-12-03 19:55:28.658661005 +0000 UTC m=+173.087459454" observedRunningTime="2025-12-03 19:55:29.839681243 +0000 UTC m=+174.268479692" watchObservedRunningTime="2025-12-03 19:55:29.84860014 +0000 UTC m=+174.277398589" Dec 03 19:55:29.862006 master-0 kubenswrapper[4813]: I1203 19:55:29.853417 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7"] Dec 03 19:55:29.904721 master-0 kubenswrapper[4813]: I1203 19:55:29.904645 4813 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb5j7\" (UniqueName: \"kubernetes.io/projected/367c2c7c-1fc8-4608-aa94-b64c6c70cc61-kube-api-access-hb5j7\") pod \"csi-snapshot-controller-86897dd478-s29k7\" (UID: \"367c2c7c-1fc8-4608-aa94-b64c6c70cc61\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" Dec 03 19:55:29.962447 master-0 kubenswrapper[4813]: I1203 19:55:29.961916 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-x6vwd" podStartSLOduration=66.961888477 podStartE2EDuration="1m6.961888477s" podCreationTimestamp="2025-12-03 19:54:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:55:29.898087316 +0000 UTC 
m=+174.326885765" watchObservedRunningTime="2025-12-03 19:55:29.961888477 +0000 UTC m=+174.390686966" Dec 03 19:55:29.963133 master-0 kubenswrapper[4813]: I1203 19:55:29.962816 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" podStartSLOduration=124.806907441 podStartE2EDuration="2m12.96281128s" podCreationTimestamp="2025-12-03 19:53:17 +0000 UTC" firstStartedPulling="2025-12-03 19:55:20.516343011 +0000 UTC m=+164.945141460" lastFinishedPulling="2025-12-03 19:55:28.67224685 +0000 UTC m=+173.101045299" observedRunningTime="2025-12-03 19:55:29.961398944 +0000 UTC m=+174.390197413" watchObservedRunningTime="2025-12-03 19:55:29.96281128 +0000 UTC m=+174.391609729" Dec 03 19:55:30.006108 master-0 kubenswrapper[4813]: I1203 19:55:30.006054 4813 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb5j7\" (UniqueName: \"kubernetes.io/projected/367c2c7c-1fc8-4608-aa94-b64c6c70cc61-kube-api-access-hb5j7\") pod \"csi-snapshot-controller-86897dd478-s29k7\" (UID: \"367c2c7c-1fc8-4608-aa94-b64c6c70cc61\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" Dec 03 19:55:30.010828 master-0 kubenswrapper[4813]: I1203 19:55:30.008611 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" podStartSLOduration=122.731389893 podStartE2EDuration="2m11.008586753s" podCreationTimestamp="2025-12-03 19:53:19 +0000 UTC" firstStartedPulling="2025-12-03 19:55:20.386103953 +0000 UTC m=+164.814902402" lastFinishedPulling="2025-12-03 19:55:28.663300813 +0000 UTC m=+173.092099262" observedRunningTime="2025-12-03 19:55:30.006830499 +0000 UTC m=+174.435628948" watchObservedRunningTime="2025-12-03 19:55:30.008586753 +0000 UTC m=+174.437385192" Dec 03 19:55:30.052938 master-0 
kubenswrapper[4813]: I1203 19:55:30.046259 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" podStartSLOduration=122.982872739 podStartE2EDuration="2m11.046234349s" podCreationTimestamp="2025-12-03 19:53:19 +0000 UTC" firstStartedPulling="2025-12-03 19:55:20.611039035 +0000 UTC m=+165.039837484" lastFinishedPulling="2025-12-03 19:55:28.674400645 +0000 UTC m=+173.103199094" observedRunningTime="2025-12-03 19:55:30.042504164 +0000 UTC m=+174.471302633" watchObservedRunningTime="2025-12-03 19:55:30.046234349 +0000 UTC m=+174.475032808" Dec 03 19:55:30.052938 master-0 kubenswrapper[4813]: I1203 19:55:30.048442 4813 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb5j7\" (UniqueName: \"kubernetes.io/projected/367c2c7c-1fc8-4608-aa94-b64c6c70cc61-kube-api-access-hb5j7\") pod \"csi-snapshot-controller-86897dd478-s29k7\" (UID: \"367c2c7c-1fc8-4608-aa94-b64c6c70cc61\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" Dec 03 19:55:30.078568 master-0 kubenswrapper[4813]: I1203 19:55:30.078241 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw" podStartSLOduration=122.902102459 podStartE2EDuration="2m11.078220832s" podCreationTimestamp="2025-12-03 19:53:19 +0000 UTC" firstStartedPulling="2025-12-03 19:55:20.474743254 +0000 UTC m=+164.903541703" lastFinishedPulling="2025-12-03 19:55:28.650861627 +0000 UTC m=+173.079660076" observedRunningTime="2025-12-03 19:55:30.077443872 +0000 UTC m=+174.506242321" watchObservedRunningTime="2025-12-03 19:55:30.078220832 +0000 UTC m=+174.507019281" Dec 03 19:55:30.093563 master-0 kubenswrapper[4813]: I1203 19:55:30.093500 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" 
podStartSLOduration=125.030551741 podStartE2EDuration="2m13.09348208s" podCreationTimestamp="2025-12-03 19:53:17 +0000 UTC" firstStartedPulling="2025-12-03 19:55:20.595317286 +0000 UTC m=+165.024115735" lastFinishedPulling="2025-12-03 19:55:28.658247625 +0000 UTC m=+173.087046074" observedRunningTime="2025-12-03 19:55:30.091855699 +0000 UTC m=+174.520654158" watchObservedRunningTime="2025-12-03 19:55:30.09348208 +0000 UTC m=+174.522280529" Dec 03 19:55:30.115133 master-0 kubenswrapper[4813]: I1203 19:55:30.115055 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" podStartSLOduration=125.489133748 podStartE2EDuration="2m13.115037797s" podCreationTimestamp="2025-12-03 19:53:17 +0000 UTC" firstStartedPulling="2025-12-03 19:55:21.103867473 +0000 UTC m=+165.532665922" lastFinishedPulling="2025-12-03 19:55:28.729771522 +0000 UTC m=+173.158569971" observedRunningTime="2025-12-03 19:55:30.115033017 +0000 UTC m=+174.543831466" watchObservedRunningTime="2025-12-03 19:55:30.115037797 +0000 UTC m=+174.543836256" Dec 03 19:55:30.143832 master-0 kubenswrapper[4813]: I1203 19:55:30.141160 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" Dec 03 19:55:33.517732 master-0 kubenswrapper[4813]: I1203 19:55:33.517383 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7"] Dec 03 19:55:33.846372 master-0 kubenswrapper[4813]: I1203 19:55:33.846091 4813 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-72rrb" event={"ID":"371917da-b783-4acc-81af-1cfc903269f4","Type":"ContainerStarted","Data":"da02e86b71f1ead478d3255496acbb74ad50db8ef78b294b1b2845925a1f0358"} Dec 03 19:55:33.864094 master-0 kubenswrapper[4813]: I1203 19:55:33.862439 4813 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-72rrb" podStartSLOduration=6.602875728 podStartE2EDuration="14.862423501s" podCreationTimestamp="2025-12-03 19:55:19 +0000 UTC" firstStartedPulling="2025-12-03 19:55:20.483905466 +0000 UTC m=+164.912703905" lastFinishedPulling="2025-12-03 19:55:28.743453229 +0000 UTC m=+173.172251678" observedRunningTime="2025-12-03 19:55:33.861298522 +0000 UTC m=+178.290096971" watchObservedRunningTime="2025-12-03 19:55:33.862423501 +0000 UTC m=+178.291221950" Dec 03 19:55:34.185827 master-0 kubenswrapper[4813]: I1203 19:55:34.185736 4813 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-6b8bb995f7-bj4vz"] Dec 03 19:55:34.186703 master-0 kubenswrapper[4813]: I1203 19:55:34.186680 4813 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" Dec 03 19:55:34.188648 master-0 kubenswrapper[4813]: I1203 19:55:34.188591 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 19:55:34.188857 master-0 kubenswrapper[4813]: I1203 19:55:34.188800 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 19:55:34.189129 master-0 kubenswrapper[4813]: I1203 19:55:34.189110 4813 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 19:55:34.189355 master-0 kubenswrapper[4813]: I1203 19:55:34.189181 4813 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 19:55:34.202911 master-0 kubenswrapper[4813]: I1203 19:55:34.193498 4813 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-6b8bb995f7-bj4vz"] Dec 03 19:55:34.236570 master-0 systemd[1]: Stopping Kubernetes Kubelet... Dec 03 19:55:34.261517 master-0 systemd[1]: kubelet.service: Deactivated successfully. Dec 03 19:55:34.261738 master-0 systemd[1]: Stopped Kubernetes Kubelet. Dec 03 19:55:34.268666 master-0 systemd[1]: kubelet.service: Consumed 11.736s CPU time. Dec 03 19:55:34.280998 master-0 systemd[1]: Starting Kubernetes Kubelet... Dec 03 19:55:34.413592 master-0 kubenswrapper[9368]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 19:55:34.413592 master-0 kubenswrapper[9368]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. 
Dec 03 19:55:34.413592 master-0 kubenswrapper[9368]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 19:55:34.413592 master-0 kubenswrapper[9368]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 19:55:34.414876 master-0 kubenswrapper[9368]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 03 19:55:34.414876 master-0 kubenswrapper[9368]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 03 19:55:34.414876 master-0 kubenswrapper[9368]: I1203 19:55:34.413710 9368 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 03 19:55:34.415928 master-0 kubenswrapper[9368]: W1203 19:55:34.415912 9368 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 19:55:34.415928 master-0 kubenswrapper[9368]: W1203 19:55:34.415926 9368 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 19:55:34.415928 master-0 kubenswrapper[9368]: W1203 19:55:34.415930 9368 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.415934 9368 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.415938 9368 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.415941 9368 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.415945 9368 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.415949 9368 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.415952 9368 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.415956 9368 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.415959 9368 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.415962 9368 feature_gate.go:330] unrecognized feature gate: 
CSIDriverSharedResource Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.415966 9368 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.415969 9368 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.415973 9368 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.415982 9368 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.415986 9368 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.415989 9368 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.415992 9368 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.415996 9368 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.416000 9368 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.416003 9368 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 19:55:34.416050 master-0 kubenswrapper[9368]: W1203 19:55:34.416008 9368 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416012 9368 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416017 9368 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416021 9368 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416025 9368 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416030 9368 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416034 9368 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416038 9368 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416042 9368 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416046 9368 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416050 9368 feature_gate.go:330] unrecognized feature gate: Example Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416053 9368 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416057 9368 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416062 9368 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416066 9368 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416069 9368 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416073 9368 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416076 9368 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416080 9368 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 19:55:34.417213 master-0 kubenswrapper[9368]: W1203 19:55:34.416085 9368 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416089 9368 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416092 9368 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416096 9368 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416099 9368 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416103 9368 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416107 9368 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416116 9368 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416119 9368 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416123 9368 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416126 9368 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416130 9368 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416133 9368 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416137 9368 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 
19:55:34.416140 9368 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416144 9368 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416149 9368 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416152 9368 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416156 9368 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416164 9368 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416169 9368 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 19:55:34.417930 master-0 kubenswrapper[9368]: W1203 19:55:34.416172 9368 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: W1203 19:55:34.416176 9368 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: W1203 19:55:34.416179 9368 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: W1203 19:55:34.416183 9368 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: W1203 19:55:34.416187 9368 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: W1203 19:55:34.416190 9368 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: W1203 19:55:34.416194 9368 feature_gate.go:330] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: W1203 19:55:34.416197 9368 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: W1203 19:55:34.416200 9368 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: W1203 19:55:34.416204 9368 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: I1203 19:55:34.416300 9368 flags.go:64] FLAG: --address="0.0.0.0" Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: I1203 19:55:34.416309 9368 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: I1203 19:55:34.416316 9368 flags.go:64] FLAG: --anonymous-auth="true" Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: I1203 19:55:34.416321 9368 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: I1203 19:55:34.416326 9368 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: I1203 19:55:34.416330 9368 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: I1203 19:55:34.416336 9368 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: I1203 19:55:34.416342 9368 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: I1203 19:55:34.416346 9368 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: I1203 19:55:34.416350 9368 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: I1203 19:55:34.416355 9368 flags.go:64] FLAG: 
--bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: I1203 19:55:34.416359 9368 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 03 19:55:34.418499 master-0 kubenswrapper[9368]: I1203 19:55:34.416363 9368 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416367 9368 flags.go:64] FLAG: --cgroup-root="" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416371 9368 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416375 9368 flags.go:64] FLAG: --client-ca-file="" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416379 9368 flags.go:64] FLAG: --cloud-config="" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416384 9368 flags.go:64] FLAG: --cloud-provider="" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416388 9368 flags.go:64] FLAG: --cluster-dns="[]" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416393 9368 flags.go:64] FLAG: --cluster-domain="" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416399 9368 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416403 9368 flags.go:64] FLAG: --config-dir="" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416407 9368 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416412 9368 flags.go:64] FLAG: --container-log-max-files="5" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416417 9368 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416422 9368 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 03 19:55:34.419075 master-0 
kubenswrapper[9368]: I1203 19:55:34.416426 9368 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416431 9368 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416435 9368 flags.go:64] FLAG: --contention-profiling="false" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416438 9368 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416443 9368 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416447 9368 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416451 9368 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416456 9368 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416461 9368 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416465 9368 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416469 9368 flags.go:64] FLAG: --enable-load-reader="false" Dec 03 19:55:34.419075 master-0 kubenswrapper[9368]: I1203 19:55:34.416473 9368 flags.go:64] FLAG: --enable-server="true" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416476 9368 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416481 9368 flags.go:64] FLAG: --event-burst="100" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416485 9368 flags.go:64] FLAG: --event-qps="50" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416490 9368 
flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416494 9368 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416498 9368 flags.go:64] FLAG: --eviction-hard="" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416503 9368 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416507 9368 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416511 9368 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416515 9368 flags.go:64] FLAG: --eviction-soft="" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416519 9368 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416523 9368 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416527 9368 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416531 9368 flags.go:64] FLAG: --experimental-mounter-path="" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416537 9368 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416541 9368 flags.go:64] FLAG: --fail-swap-on="true" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416544 9368 flags.go:64] FLAG: --feature-gates="" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416550 9368 flags.go:64] FLAG: --file-check-frequency="20s" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416554 9368 flags.go:64] FLAG: 
--global-housekeeping-interval="1m0s" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416558 9368 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416562 9368 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416566 9368 flags.go:64] FLAG: --healthz-port="10248" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416570 9368 flags.go:64] FLAG: --help="false" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416574 9368 flags.go:64] FLAG: --hostname-override="" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416578 9368 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 03 19:55:34.419962 master-0 kubenswrapper[9368]: I1203 19:55:34.416582 9368 flags.go:64] FLAG: --http-check-frequency="20s" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416586 9368 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416590 9368 flags.go:64] FLAG: --image-credential-provider-config="" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416596 9368 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416600 9368 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416604 9368 flags.go:64] FLAG: --image-service-endpoint="" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416608 9368 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416612 9368 flags.go:64] FLAG: --kube-api-burst="100" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416617 9368 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416621 9368 flags.go:64] FLAG: --kube-api-qps="50" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416625 9368 flags.go:64] FLAG: --kube-reserved="" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416629 9368 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416633 9368 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416637 9368 flags.go:64] FLAG: --kubelet-cgroups="" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416641 9368 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416645 9368 flags.go:64] FLAG: --lock-file="" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416649 9368 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416653 9368 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416657 9368 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416663 9368 flags.go:64] FLAG: --log-json-split-stream="false" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416667 9368 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416673 9368 flags.go:64] FLAG: --log-text-split-stream="false" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416676 9368 flags.go:64] FLAG: --logging-format="text" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416680 9368 flags.go:64] FLAG: 
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416685 9368 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 03 19:55:34.420582 master-0 kubenswrapper[9368]: I1203 19:55:34.416689 9368 flags.go:64] FLAG: --manifest-url="" Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416693 9368 flags.go:64] FLAG: --manifest-url-header="" Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416698 9368 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416702 9368 flags.go:64] FLAG: --max-open-files="1000000" Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416707 9368 flags.go:64] FLAG: --max-pods="110" Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416711 9368 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416715 9368 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416719 9368 flags.go:64] FLAG: --memory-manager-policy="None" Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416723 9368 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416727 9368 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416731 9368 flags.go:64] FLAG: --node-ip="192.168.32.10" Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416735 9368 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416746 9368 flags.go:64] FLAG: --node-status-max-images="50" Dec 03 19:55:34.421185 master-0 
kubenswrapper[9368]: I1203 19:55:34.416750 9368 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416754 9368 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416758 9368 flags.go:64] FLAG: --pod-cidr=""
Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416762 9368 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fff930cf757e23d388d86d05942b76e44d3bda5e387b299c239e4d12545d26dd"
Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416769 9368 flags.go:64] FLAG: --pod-manifest-path=""
Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416772 9368 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416780 9368 flags.go:64] FLAG: --pods-per-core="0"
Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416784 9368 flags.go:64] FLAG: --port="10250"
Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416788 9368 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416806 9368 flags.go:64] FLAG: --provider-id=""
Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416810 9368 flags.go:64] FLAG: --qos-reserved=""
Dec 03 19:55:34.421185 master-0 kubenswrapper[9368]: I1203 19:55:34.416815 9368 flags.go:64] FLAG: --read-only-port="10255"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416819 9368 flags.go:64] FLAG: --register-node="true"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416823 9368 flags.go:64] FLAG: --register-schedulable="true"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416827 9368 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416837 9368 flags.go:64] FLAG: --registry-burst="10"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416842 9368 flags.go:64] FLAG: --registry-qps="5"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416846 9368 flags.go:64] FLAG: --reserved-cpus=""
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416850 9368 flags.go:64] FLAG: --reserved-memory=""
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416855 9368 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416859 9368 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416864 9368 flags.go:64] FLAG: --rotate-certificates="false"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416868 9368 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416872 9368 flags.go:64] FLAG: --runonce="false"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416876 9368 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416880 9368 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416884 9368 flags.go:64] FLAG: --seccomp-default="false"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416888 9368 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416892 9368 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416896 9368 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416900 9368 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416904 9368 flags.go:64] FLAG: --storage-driver-password="root"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416908 9368 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416913 9368 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416916 9368 flags.go:64] FLAG: --storage-driver-user="root"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416920 9368 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 03 19:55:34.422564 master-0 kubenswrapper[9368]: I1203 19:55:34.416924 9368 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: I1203 19:55:34.416929 9368 flags.go:64] FLAG: --system-cgroups=""
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: I1203 19:55:34.416933 9368 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: I1203 19:55:34.416939 9368 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: I1203 19:55:34.416943 9368 flags.go:64] FLAG: --tls-cert-file=""
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: I1203 19:55:34.416947 9368 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: I1203 19:55:34.416953 9368 flags.go:64] FLAG: --tls-min-version=""
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: I1203 19:55:34.416957 9368 flags.go:64] FLAG: --tls-private-key-file=""
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: I1203 19:55:34.416961 9368 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: I1203 19:55:34.416965 9368 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: I1203 19:55:34.416969 9368 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: I1203 19:55:34.416976 9368 flags.go:64] FLAG: --v="2"
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: I1203 19:55:34.416983 9368 flags.go:64] FLAG: --version="false"
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: I1203 19:55:34.416988 9368 flags.go:64] FLAG: --vmodule=""
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: I1203 19:55:34.416993 9368 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: I1203 19:55:34.416997 9368 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: W1203 19:55:34.417095 9368 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: W1203 19:55:34.417100 9368 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: W1203 19:55:34.417105 9368 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: W1203 19:55:34.417109 9368 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: W1203 19:55:34.417112 9368 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: W1203 19:55:34.417116 9368 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: W1203 19:55:34.417119 9368 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 19:55:34.423467 master-0 kubenswrapper[9368]: W1203 19:55:34.417123 9368 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417127 9368 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417130 9368 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417134 9368 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417137 9368 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417141 9368 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417144 9368 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417148 9368 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417151 9368 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417154 9368 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417158 9368 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417162 9368 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417165 9368 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417169 9368 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417173 9368 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417176 9368 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417180 9368 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417183 9368 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417187 9368 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417190 9368 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 19:55:34.425254 master-0 kubenswrapper[9368]: W1203 19:55:34.417196 9368 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417203 9368 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417207 9368 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417211 9368 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417214 9368 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417218 9368 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417221 9368 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417225 9368 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417228 9368 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417232 9368 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417236 9368 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417239 9368 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417243 9368 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417246 9368 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417250 9368 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417253 9368 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417257 9368 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417260 9368 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417264 9368 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417267 9368 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 19:55:34.426128 master-0 kubenswrapper[9368]: W1203 19:55:34.417270 9368 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417274 9368 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417277 9368 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417281 9368 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417285 9368 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417288 9368 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417292 9368 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417295 9368 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417299 9368 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417303 9368 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417309 9368 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417313 9368 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417319 9368 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417325 9368 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417329 9368 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417333 9368 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417337 9368 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417341 9368 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417345 9368 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 19:55:34.426830 master-0 kubenswrapper[9368]: W1203 19:55:34.417348 9368 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 19:55:34.428010 master-0 kubenswrapper[9368]: W1203 19:55:34.417352 9368 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 19:55:34.428010 master-0 kubenswrapper[9368]: W1203 19:55:34.417355 9368 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 19:55:34.428010 master-0 kubenswrapper[9368]: W1203 19:55:34.417359 9368 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 19:55:34.428010 master-0 kubenswrapper[9368]: W1203 19:55:34.417363 9368 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 19:55:34.428010 master-0 kubenswrapper[9368]: W1203 19:55:34.417366 9368 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 19:55:34.428010 master-0 kubenswrapper[9368]: I1203 19:55:34.417372 9368 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 19:55:34.428010 master-0 kubenswrapper[9368]: I1203 19:55:34.425518 9368 server.go:491] "Kubelet version" kubeletVersion="v1.31.13"
Dec 03 19:55:34.428010 master-0 kubenswrapper[9368]: I1203 19:55:34.425546 9368 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 03 19:55:34.428010 master-0 kubenswrapper[9368]: W1203 19:55:34.425628 9368 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 19:55:34.428010 master-0 kubenswrapper[9368]: W1203 19:55:34.425639 9368 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 19:55:34.428010 master-0 kubenswrapper[9368]: W1203 19:55:34.425645 9368 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 19:55:34.428010 master-0 kubenswrapper[9368]: W1203 19:55:34.425651 9368 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 19:55:34.428010 master-0 kubenswrapper[9368]: W1203 19:55:34.425657 9368 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 19:55:34.428010 master-0 kubenswrapper[9368]: W1203 19:55:34.425662 9368 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425668 9368 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425672 9368 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425677 9368 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425706 9368 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425711 9368 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425716 9368 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425721 9368 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425726 9368 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425730 9368 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425735 9368 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425739 9368 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425743 9368 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425748 9368 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425752 9368 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425758 9368 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425764 9368 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425769 9368 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425777 9368 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 19:55:34.428484 master-0 kubenswrapper[9368]: W1203 19:55:34.425783 9368 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425787 9368 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425805 9368 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425810 9368 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425814 9368 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425819 9368 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425824 9368 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425829 9368 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425835 9368 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425840 9368 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425844 9368 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425848 9368 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425853 9368 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425858 9368 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425862 9368 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425867 9368 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425872 9368 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425878 9368 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425883 9368 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425888 9368 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 19:55:34.429218 master-0 kubenswrapper[9368]: W1203 19:55:34.425892 9368 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425897 9368 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425901 9368 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425906 9368 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425910 9368 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425915 9368 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425921 9368 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425925 9368 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425929 9368 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425934 9368 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425938 9368 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425945 9368 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425949 9368 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425953 9368 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425957 9368 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425962 9368 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425968 9368 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425974 9368 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425980 9368 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425985 9368 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 19:55:34.429992 master-0 kubenswrapper[9368]: W1203 19:55:34.425990 9368 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 19:55:34.430742 master-0 kubenswrapper[9368]: W1203 19:55:34.425995 9368 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 19:55:34.430742 master-0 kubenswrapper[9368]: W1203 19:55:34.426000 9368 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 19:55:34.430742 master-0 kubenswrapper[9368]: W1203 19:55:34.426006 9368 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 19:55:34.430742 master-0 kubenswrapper[9368]: W1203 19:55:34.426011 9368 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 19:55:34.430742 master-0 kubenswrapper[9368]: W1203 19:55:34.426016 9368 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 19:55:34.430742 master-0 kubenswrapper[9368]: W1203 19:55:34.426021 9368 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 19:55:34.430742 master-0 kubenswrapper[9368]: W1203 19:55:34.426026 9368 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 19:55:34.430742 master-0 kubenswrapper[9368]: I1203 19:55:34.426034 9368 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 19:55:34.430742 master-0 kubenswrapper[9368]: W1203 19:55:34.426212 9368 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 19:55:34.430742 master-0 kubenswrapper[9368]: W1203 19:55:34.426222 9368 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 19:55:34.430742 master-0 kubenswrapper[9368]: W1203 19:55:34.426228 9368 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 19:55:34.430742 master-0 kubenswrapper[9368]: W1203 19:55:34.426234 9368 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 19:55:34.430742 master-0 kubenswrapper[9368]: W1203 19:55:34.426239 9368 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 19:55:34.430742 master-0 kubenswrapper[9368]: W1203 19:55:34.426244 9368 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 19:55:34.430742 master-0 kubenswrapper[9368]: W1203 19:55:34.426248 9368 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426253 9368 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426258 9368 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426262 9368 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426267 9368 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426272 9368 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426276 9368 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426281 9368 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426285 9368 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426290 9368 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426294 9368 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426299 9368 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426303 9368 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426309 9368 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426314 9368 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426319 9368 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426324 9368 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426330 9368 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426336 9368 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426342 9368 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 19:55:34.431291 master-0 kubenswrapper[9368]: W1203 19:55:34.426347 9368 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426352 9368 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426357 9368 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426362 9368 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426367 9368 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426372 9368 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426377 9368 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426382 9368 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426386 9368 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426390 9368 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426394 9368 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426399 9368 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426413 9368 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426418 9368 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426423 9368 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426428 9368 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426432 9368 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426436 9368 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426441 9368 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426445 9368 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 19:55:34.432109 master-0 kubenswrapper[9368]: W1203 19:55:34.426449 9368 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426454 9368 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426458 9368 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426463 9368 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426467 9368 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426471 9368 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426475 9368 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426480 9368 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426484 9368 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426488 9368 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426493 9368 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426499 9368 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426503 9368 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426507 9368 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426513 9368 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426519 9368 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426524 9368 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426529 9368 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426534 9368 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 19:55:34.432893 master-0 kubenswrapper[9368]: W1203 19:55:34.426538 9368 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 19:55:34.433570 master-0 kubenswrapper[9368]: W1203 19:55:34.426543 9368 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 19:55:34.433570 master-0 kubenswrapper[9368]: W1203 19:55:34.426549 9368 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 19:55:34.433570 master-0 kubenswrapper[9368]: W1203 19:55:34.426553 9368 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 19:55:34.433570 master-0 kubenswrapper[9368]: W1203 19:55:34.426557 9368 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 
19:55:34.433570 master-0 kubenswrapper[9368]: W1203 19:55:34.426562 9368 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 19:55:34.433570 master-0 kubenswrapper[9368]: W1203 19:55:34.426567 9368 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 19:55:34.433570 master-0 kubenswrapper[9368]: I1203 19:55:34.426575 9368 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 19:55:34.433570 master-0 kubenswrapper[9368]: I1203 19:55:34.426766 9368 server.go:940] "Client rotation is on, will bootstrap in background" Dec 03 19:55:34.433570 master-0 kubenswrapper[9368]: I1203 19:55:34.428622 9368 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 03 19:55:34.433570 master-0 kubenswrapper[9368]: I1203 19:55:34.428704 9368 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 03 19:55:34.433570 master-0 kubenswrapper[9368]: I1203 19:55:34.429001 9368 server.go:997] "Starting client certificate rotation" Dec 03 19:55:34.433570 master-0 kubenswrapper[9368]: I1203 19:55:34.429013 9368 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 03 19:55:34.433570 master-0 kubenswrapper[9368]: I1203 19:55:34.429288 9368 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2025-12-04 19:44:55 +0000 UTC, rotation deadline is 2025-12-04 14:27:42.507140407 +0000 UTC Dec 03 19:55:34.434094 master-0 kubenswrapper[9368]: I1203 19:55:34.429357 9368 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 18h32m8.077786477s for next certificate rotation Dec 03 19:55:34.434094 master-0 kubenswrapper[9368]: I1203 19:55:34.430138 9368 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 19:55:34.434094 master-0 kubenswrapper[9368]: I1203 19:55:34.431940 9368 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 19:55:34.435756 master-0 kubenswrapper[9368]: I1203 19:55:34.435736 9368 log.go:25] "Validated CRI v1 runtime API" Dec 03 19:55:34.438168 master-0 kubenswrapper[9368]: I1203 19:55:34.438036 9368 log.go:25] "Validated CRI v1 image API" Dec 03 19:55:34.439204 master-0 kubenswrapper[9368]: I1203 19:55:34.439183 9368 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 03 19:55:34.444747 master-0 kubenswrapper[9368]: I1203 19:55:34.444702 9368 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4 a110c2ad-b51b-427d-8eb4-4344f49e01ee:/dev/vda3] Dec 03 19:55:34.445178 master-0 kubenswrapper[9368]: I1203 19:55:34.444748 9368 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 
minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/03f773582fd952a02e3c74054c230118b5ae30a27243d494447b73fc93b2a301/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/03f773582fd952a02e3c74054c230118b5ae30a27243d494447b73fc93b2a301/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/048697d9a6342582c8b3059ecb9a0cfe7c0a764a192a00f0ded82f4081cc7252/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/048697d9a6342582c8b3059ecb9a0cfe7c0a764a192a00f0ded82f4081cc7252/userdata/shm major:0 minor:136 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/08c01ca5f1fe5f2ef9cd1ac17b729f8e737e95206dcb86f9ce9c09225b746a55/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/08c01ca5f1fe5f2ef9cd1ac17b729f8e737e95206dcb86f9ce9c09225b746a55/userdata/shm major:0 minor:342 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0c22de28b514bd9de5323a780b66baaf0574a8898405da26c3c85130d1ec1ce9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0c22de28b514bd9de5323a780b66baaf0574a8898405da26c3c85130d1ec1ce9/userdata/shm major:0 minor:315 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/15fa7ece9624e476a927666dc492b7bd2df94f7942d686ce643ec390d690ecca/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/15fa7ece9624e476a927666dc492b7bd2df94f7942d686ce643ec390d690ecca/userdata/shm major:0 minor:387 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1677ae8793f1b3e61b335ded5b7ac95e63d604742bdba149b92ecb06281d760f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1677ae8793f1b3e61b335ded5b7ac95e63d604742bdba149b92ecb06281d760f/userdata/shm major:0 
minor:171 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/38072447ae412858938614108f0275e0c66bb65d93f888cc2667f73663ae0790/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/38072447ae412858938614108f0275e0c66bb65d93f888cc2667f73663ae0790/userdata/shm major:0 minor:346 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/46b628f030def8d568abe6c88697be71ce064596569bc0a66bddd83c9802cf26/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/46b628f030def8d568abe6c88697be71ce064596569bc0a66bddd83c9802cf26/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/49efa7facfce8d50bf6399ae2e6f96a9a16dc5f311b520ce196c50d981643fd1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/49efa7facfce8d50bf6399ae2e6f96a9a16dc5f311b520ce196c50d981643fd1/userdata/shm major:0 minor:320 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/75f2d2ce983b4d5090010050d78ba28c8452643f80661c230a1cbdc90a216214/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/75f2d2ce983b4d5090010050d78ba28c8452643f80661c230a1cbdc90a216214/userdata/shm major:0 minor:187 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/80067c895e8606a9acda897c0ee9b8e4c440d9838ee8d74d86c0a12d51b59462/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/80067c895e8606a9acda897c0ee9b8e4c440d9838ee8d74d86c0a12d51b59462/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8b5478f55322c86d9620262432fda124f2df1ae79e09d51d64ffbf6929820091/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8b5478f55322c86d9620262432fda124f2df1ae79e09d51d64ffbf6929820091/userdata/shm major:0 minor:358 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/988c74f72d6d3987e23eadc15e10a46097f9412b88f2d407e398a913b05fa016/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/988c74f72d6d3987e23eadc15e10a46097f9412b88f2d407e398a913b05fa016/userdata/shm major:0 minor:488 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a336a885dee021602a811c06c05965d3ceafbc2a4e4dc7061efbb563491832b7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a336a885dee021602a811c06c05965d3ceafbc2a4e4dc7061efbb563491832b7/userdata/shm major:0 minor:324 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b4885c85229f1632ce115036d60c7a6767b9efe2b85e96fadba3614a99fdc575/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b4885c85229f1632ce115036d60c7a6767b9efe2b85e96fadba3614a99fdc575/userdata/shm major:0 minor:326 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b7019b680708a6b0cc34565d068ec422e5cf82d6c1379cc668471d678f72f33d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b7019b680708a6b0cc34565d068ec422e5cf82d6c1379cc668471d678f72f33d/userdata/shm major:0 minor:163 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c8fa62db9ae1d5afc07c786415f97448d1baeaca29acf6f92b49c7da920421a7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c8fa62db9ae1d5afc07c786415f97448d1baeaca29acf6f92b49c7da920421a7/userdata/shm major:0 minor:317 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d0d3daec6476579642facd81cb6257eb10f7c617299056e3757e4a0c79c948a4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d0d3daec6476579642facd81cb6257eb10f7c617299056e3757e4a0c79c948a4/userdata/shm major:0 minor:121 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/db6143edbd1b68cfe8bbe553ee3ca87d799ea0e63aff48d4d038dfa43496204a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/db6143edbd1b68cfe8bbe553ee3ca87d799ea0e63aff48d4d038dfa43496204a/userdata/shm major:0 minor:382 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e223c914bb9fdab3679b22e12a3423e70834ea2d5e7b1b525318a3b2a1eb7382/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e223c914bb9fdab3679b22e12a3423e70834ea2d5e7b1b525318a3b2a1eb7382/userdata/shm major:0 minor:46 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e6304ea619f0b996e6dede6cd4e07910aa977eac4013d0444808ca8298842f22/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e6304ea619f0b996e6dede6cd4e07910aa977eac4013d0444808ca8298842f22/userdata/shm major:0 minor:327 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ec90b46e5817f62e5cb3d92e8419aeaaa1a2c0a9eebd84f2c7545dcfdabcf365/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ec90b46e5817f62e5cb3d92e8419aeaaa1a2c0a9eebd84f2c7545dcfdabcf365/userdata/shm major:0 minor:331 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ef04edbf93893169f2ce0656a624fe737e2b430675591752e41e98b545e6bf40/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ef04edbf93893169f2ce0656a624fe737e2b430675591752e41e98b545e6bf40/userdata/shm major:0 minor:334 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f6cfc0641f7e192cbb940115d2ba3add0762b14146ea756523e733a04332e0a9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f6cfc0641f7e192cbb940115d2ba3add0762b14146ea756523e733a04332e0a9/userdata/shm major:0 minor:337 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/fb66039883a03fd1626aa3dffc21a20bb7b9e0bf48c135b576d0ba2ac23105d3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fb66039883a03fd1626aa3dffc21a20bb7b9e0bf48c135b576d0ba2ac23105d3/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fd040c8de744a713ee80a954f75065a2b691638426b8496773ad0910f9875316/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fd040c8de744a713ee80a954f75065a2b691638426b8496773ad0910f9875316/userdata/shm major:0 minor:122 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/01d51d9a-9beb-4357-9dc2-aeac210cd0c4/volumes/kubernetes.io~projected/kube-api-access-6sqtm:{mountpoint:/var/lib/kubelet/pods/01d51d9a-9beb-4357-9dc2-aeac210cd0c4/volumes/kubernetes.io~projected/kube-api-access-6sqtm major:0 minor:323 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/01d51d9a-9beb-4357-9dc2-aeac210cd0c4/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/01d51d9a-9beb-4357-9dc2-aeac210cd0c4/volumes/kubernetes.io~secret/serving-cert major:0 minor:298 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/volumes/kubernetes.io~projected/kube-api-access-qdhcd:{mountpoint:/var/lib/kubelet/pods/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/volumes/kubernetes.io~projected/kube-api-access-qdhcd major:0 minor:311 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/volumes/kubernetes.io~secret/serving-cert major:0 minor:288 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0c45d22f-1492-47d7-83b6-6dd356a8454d/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/0c45d22f-1492-47d7-83b6-6dd356a8454d/volumes/kubernetes.io~projected/kube-api-access major:0 minor:120 fsType:tmpfs 
blockSize:0} /var/lib/kubelet/pods/0d4e4f88-7106-4a46-8b63-053345922fb0/volumes/kubernetes.io~projected/kube-api-access-crfnp:{mountpoint:/var/lib/kubelet/pods/0d4e4f88-7106-4a46-8b63-053345922fb0/volumes/kubernetes.io~projected/kube-api-access-crfnp major:0 minor:286 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/11e2c94f-f9e9-415b-a550-3006a4632ba4/volumes/kubernetes.io~projected/kube-api-access-pfqnq:{mountpoint:/var/lib/kubelet/pods/11e2c94f-f9e9-415b-a550-3006a4632ba4/volumes/kubernetes.io~projected/kube-api-access-pfqnq major:0 minor:319 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/11e2c94f-f9e9-415b-a550-3006a4632ba4/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/11e2c94f-f9e9-415b-a550-3006a4632ba4/volumes/kubernetes.io~secret/serving-cert major:0 minor:296 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/128ed384-7ab6-41b6-bf45-c8fda917d52f/volumes/kubernetes.io~projected/kube-api-access-7qrgh:{mountpoint:/var/lib/kubelet/pods/128ed384-7ab6-41b6-bf45-c8fda917d52f/volumes/kubernetes.io~projected/kube-api-access-7qrgh major:0 minor:336 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2f618ea7-3ad7-4dce-b450-a8202285f312/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/2f618ea7-3ad7-4dce-b450-a8202285f312/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2f618ea7-3ad7-4dce-b450-a8202285f312/volumes/kubernetes.io~projected/kube-api-access-4c9qq:{mountpoint:/var/lib/kubelet/pods/2f618ea7-3ad7-4dce-b450-a8202285f312/volumes/kubernetes.io~projected/kube-api-access-4c9qq major:0 minor:170 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2f618ea7-3ad7-4dce-b450-a8202285f312/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/2f618ea7-3ad7-4dce-b450-a8202285f312/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:169 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/367c2c7c-1fc8-4608-aa94-b64c6c70cc61/volumes/kubernetes.io~projected/kube-api-access-hb5j7:{mountpoint:/var/lib/kubelet/pods/367c2c7c-1fc8-4608-aa94-b64c6c70cc61/volumes/kubernetes.io~projected/kube-api-access-hb5j7 major:0 minor:487 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/371917da-b783-4acc-81af-1cfc903269f4/volumes/kubernetes.io~projected/kube-api-access-w4v7k:{mountpoint:/var/lib/kubelet/pods/371917da-b783-4acc-81af-1cfc903269f4/volumes/kubernetes.io~projected/kube-api-access-w4v7k major:0 minor:341 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:333 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/volumes/kubernetes.io~projected/kube-api-access-6bhk4:{mountpoint:/var/lib/kubelet/pods/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/volumes/kubernetes.io~projected/kube-api-access-6bhk4 major:0 minor:303 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46b5d4d0-b841-4e87-84b4-85911ff04325/volumes/kubernetes.io~projected/kube-api-access-s2c85:{mountpoint:/var/lib/kubelet/pods/46b5d4d0-b841-4e87-84b4-85911ff04325/volumes/kubernetes.io~projected/kube-api-access-s2c85 major:0 minor:146 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5b3ee9a2-0f17-4a04-9191-b60684ef6c29/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/5b3ee9a2-0f17-4a04-9191-b60684ef6c29/volumes/kubernetes.io~projected/kube-api-access major:0 minor:313 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5b3ee9a2-0f17-4a04-9191-b60684ef6c29/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/5b3ee9a2-0f17-4a04-9191-b60684ef6c29/volumes/kubernetes.io~secret/serving-cert major:0 minor:292 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5decce88-c71e-411c-87b5-a37dd0f77e7b/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/5decce88-c71e-411c-87b5-a37dd0f77e7b/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:304 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5decce88-c71e-411c-87b5-a37dd0f77e7b/volumes/kubernetes.io~projected/kube-api-access-mr8x9:{mountpoint:/var/lib/kubelet/pods/5decce88-c71e-411c-87b5-a37dd0f77e7b/volumes/kubernetes.io~projected/kube-api-access-mr8x9 major:0 minor:340 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6eb4700c-6af0-468b-afc8-1e09b902d6bf/volumes/kubernetes.io~projected/kube-api-access-w7nkb:{mountpoint:/var/lib/kubelet/pods/6eb4700c-6af0-468b-afc8-1e09b902d6bf/volumes/kubernetes.io~projected/kube-api-access-w7nkb major:0 minor:119 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6eb4700c-6af0-468b-afc8-1e09b902d6bf/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/6eb4700c-6af0-468b-afc8-1e09b902d6bf/volumes/kubernetes.io~secret/metrics-tls major:0 minor:77 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/78a864f2-934f-4197-9753-24c9bc7f1fca/volumes/kubernetes.io~projected/kube-api-access-59d2r:{mountpoint:/var/lib/kubelet/pods/78a864f2-934f-4197-9753-24c9bc7f1fca/volumes/kubernetes.io~projected/kube-api-access-59d2r major:0 minor:306 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/78a864f2-934f-4197-9753-24c9bc7f1fca/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/78a864f2-934f-4197-9753-24c9bc7f1fca/volumes/kubernetes.io~secret/etcd-client major:0 minor:295 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/78a864f2-934f-4197-9753-24c9bc7f1fca/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/78a864f2-934f-4197-9753-24c9bc7f1fca/volumes/kubernetes.io~secret/serving-cert major:0 minor:287 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/7ed25861-1328-45e7-922e-37588a0b019c/volumes/kubernetes.io~projected/kube-api-access-cv24n:{mountpoint:/var/lib/kubelet/pods/7ed25861-1328-45e7-922e-37588a0b019c/volumes/kubernetes.io~projected/kube-api-access-cv24n major:0 minor:301 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/830d89af-1266-43ac-b113-990a28595f91/volumes/kubernetes.io~projected/kube-api-access-lkhcw:{mountpoint:/var/lib/kubelet/pods/830d89af-1266-43ac-b113-990a28595f91/volumes/kubernetes.io~projected/kube-api-access-lkhcw major:0 minor:386 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/87f1759a-7df4-442e-a22d-6de8d54be333/volumes/kubernetes.io~projected/kube-api-access-wvllg:{mountpoint:/var/lib/kubelet/pods/87f1759a-7df4-442e-a22d-6de8d54be333/volumes/kubernetes.io~projected/kube-api-access-wvllg major:0 minor:135 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/943feb0d-7d31-446a-9100-dfc4ef013d12/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/943feb0d-7d31-446a-9100-dfc4ef013d12/volumes/kubernetes.io~projected/kube-api-access major:0 minor:307 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/943feb0d-7d31-446a-9100-dfc4ef013d12/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/943feb0d-7d31-446a-9100-dfc4ef013d12/volumes/kubernetes.io~secret/serving-cert major:0 minor:293 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a185ee17-4b4b-4d20-a8ed-56a2a01f1807/volumes/kubernetes.io~projected/kube-api-access-sxqph:{mountpoint:/var/lib/kubelet/pods/a185ee17-4b4b-4d20-a8ed-56a2a01f1807/volumes/kubernetes.io~projected/kube-api-access-sxqph major:0 minor:302 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a185ee17-4b4b-4d20-a8ed-56a2a01f1807/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a185ee17-4b4b-4d20-a8ed-56a2a01f1807/volumes/kubernetes.io~secret/serving-cert major:0 minor:294 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/a19b8f9e-6299-43bf-9aa5-22071b855773/volumes/kubernetes.io~projected/kube-api-access-6ghnf:{mountpoint:/var/lib/kubelet/pods/a19b8f9e-6299-43bf-9aa5-22071b855773/volumes/kubernetes.io~projected/kube-api-access-6ghnf major:0 minor:310 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a19b8f9e-6299-43bf-9aa5-22071b855773/volumes/kubernetes.io~secret/profile-collector-cert:{mountpoint:/var/lib/kubelet/pods/a19b8f9e-6299-43bf-9aa5-22071b855773/volumes/kubernetes.io~secret/profile-collector-cert major:0 minor:300 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6/volumes/kubernetes.io~projected/kube-api-access-bztz2:{mountpoint:/var/lib/kubelet/pods/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6/volumes/kubernetes.io~projected/kube-api-access-bztz2 major:0 minor:118 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b4316c8d-a1d3-4e51-83cc-d0eecb809924/volumes/kubernetes.io~projected/kube-api-access-74dvx:{mountpoint:/var/lib/kubelet/pods/b4316c8d-a1d3-4e51-83cc-d0eecb809924/volumes/kubernetes.io~projected/kube-api-access-74dvx major:0 minor:338 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b673cb04-f6f0-4113-bdcd-d6685b942c9f/volumes/kubernetes.io~projected/kube-api-access-m2qch:{mountpoint:/var/lib/kubelet/pods/b673cb04-f6f0-4113-bdcd-d6685b942c9f/volumes/kubernetes.io~projected/kube-api-access-m2qch major:0 minor:312 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b84835e3-e8bc-4aa4-a8f3-f9be702a358a/volumes/kubernetes.io~projected/kube-api-access-vtwbs:{mountpoint:/var/lib/kubelet/pods/b84835e3-e8bc-4aa4-a8f3-f9be702a358a/volumes/kubernetes.io~projected/kube-api-access-vtwbs major:0 minor:309 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ba68608f-6b36-455e-b80b-d19237df9312/volumes/kubernetes.io~projected/kube-api-access-855t4:{mountpoint:/var/lib/kubelet/pods/ba68608f-6b36-455e-b80b-d19237df9312/volumes/kubernetes.io~projected/kube-api-access-855t4 major:0 minor:381 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/c4d45235-fb1a-4626-a41e-b1e34f7bf76e/volumes/kubernetes.io~projected/kube-api-access-qhg82:{mountpoint:/var/lib/kubelet/pods/c4d45235-fb1a-4626-a41e-b1e34f7bf76e/volumes/kubernetes.io~projected/kube-api-access-qhg82 major:0 minor:185 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c4d45235-fb1a-4626-a41e-b1e34f7bf76e/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/c4d45235-fb1a-4626-a41e-b1e34f7bf76e/volumes/kubernetes.io~secret/webhook-cert major:0 minor:186 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d210062f-c07e-419f-a551-c37571565686/volumes/kubernetes.io~projected/kube-api-access-v7xk9:{mountpoint:/var/lib/kubelet/pods/d210062f-c07e-419f-a551-c37571565686/volumes/kubernetes.io~projected/kube-api-access-v7xk9 major:0 minor:162 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d210062f-c07e-419f-a551-c37571565686/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/d210062f-c07e-419f-a551-c37571565686/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:161 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/volumes/kubernetes.io~projected/kube-api-access-457ln:{mountpoint:/var/lib/kubelet/pods/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/volumes/kubernetes.io~projected/kube-api-access-457ln major:0 minor:314 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/volumes/kubernetes.io~secret/serving-cert major:0 minor:289 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d5f33153-bff1-403f-ae17-b7e90500365d/volumes/kubernetes.io~projected/kube-api-access-5sdw4:{mountpoint:/var/lib/kubelet/pods/d5f33153-bff1-403f-ae17-b7e90500365d/volumes/kubernetes.io~projected/kube-api-access-5sdw4 major:0 minor:305 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/d5f33153-bff1-403f-ae17-b7e90500365d/volumes/kubernetes.io~secret/profile-collector-cert:{mountpoint:/var/lib/kubelet/pods/d5f33153-bff1-403f-ae17-b7e90500365d/volumes/kubernetes.io~secret/profile-collector-cert major:0 minor:299 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/daa8efc0-4514-4a14-80f5-ab9eca53a127/volumes/kubernetes.io~projected/kube-api-access-rbsx8:{mountpoint:/var/lib/kubelet/pods/daa8efc0-4514-4a14-80f5-ab9eca53a127/volumes/kubernetes.io~projected/kube-api-access-rbsx8 major:0 minor:308 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/daa8efc0-4514-4a14-80f5-ab9eca53a127/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/daa8efc0-4514-4a14-80f5-ab9eca53a127/volumes/kubernetes.io~secret/serving-cert major:0 minor:291 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/volumes/kubernetes.io~projected/kube-api-access major:0 minor:322 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/volumes/kubernetes.io~secret/serving-cert major:0 minor:290 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f9f99422-7991-40ef-92a1-de2e603e47b9/volumes/kubernetes.io~projected/kube-api-access-pk4z4:{mountpoint:/var/lib/kubelet/pods/f9f99422-7991-40ef-92a1-de2e603e47b9/volumes/kubernetes.io~projected/kube-api-access-pk4z4 major:0 minor:330 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f9f99422-7991-40ef-92a1-de2e603e47b9/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/f9f99422-7991-40ef-92a1-de2e603e47b9/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:297 fsType:tmpfs blockSize:0} 
overlay_0-102:{mountpoint:/var/lib/containers/storage/overlay/42d502721e48298553a7c34a551ca0d0505ac2db22a13ff58996512369208007/merged major:0 minor:102 fsType:overlay blockSize:0} overlay_0-104:{mountpoint:/var/lib/containers/storage/overlay/a0f69d7914635c64595685e6bb5a9d38e943f729dc6be1cb3ad8ccfc0c67cd25/merged major:0 minor:104 fsType:overlay blockSize:0} overlay_0-124:{mountpoint:/var/lib/containers/storage/overlay/7553b136442b262bdba3952c9950e5fb24caa30dcae9fe6cc01c85b7e958db26/merged major:0 minor:124 fsType:overlay blockSize:0} overlay_0-127:{mountpoint:/var/lib/containers/storage/overlay/0bd69db7ee13ba17fdb90b4e845e0b08823660e03858a2fefa9c745e7a3260b8/merged major:0 minor:127 fsType:overlay blockSize:0} overlay_0-129:{mountpoint:/var/lib/containers/storage/overlay/1eb584e6c3999a082e46cf43d8b709ac816b0d880e0138c879631b4da8ed930e/merged major:0 minor:129 fsType:overlay blockSize:0} overlay_0-131:{mountpoint:/var/lib/containers/storage/overlay/1162b9d5b06f470702e57ad9978e7ed92220a1d2cdf231f28dc981aa42a42acb/merged major:0 minor:131 fsType:overlay blockSize:0} overlay_0-133:{mountpoint:/var/lib/containers/storage/overlay/328d4aa0a88b6bb9bdca4814fcaf2034cbd0dd54786349225d46e254ed685e20/merged major:0 minor:133 fsType:overlay blockSize:0} overlay_0-138:{mountpoint:/var/lib/containers/storage/overlay/1004c29d4dc2e5ae9f71f2b283a0161528285c854807a1dbdf6ef884115bb780/merged major:0 minor:138 fsType:overlay blockSize:0} overlay_0-147:{mountpoint:/var/lib/containers/storage/overlay/f75f2289eb91a86ce692ca0d840dbbe974ddb5b8c357cd7b938271cbfa785b82/merged major:0 minor:147 fsType:overlay blockSize:0} overlay_0-149:{mountpoint:/var/lib/containers/storage/overlay/bc4f71e8b8d82e0ad27bd2f134fd5975227dcc209d49bd614753fdc0e79baf93/merged major:0 minor:149 fsType:overlay blockSize:0} overlay_0-151:{mountpoint:/var/lib/containers/storage/overlay/c071acf2c42a58f5fd09cb85c464730b75ab6dff05815d5752fc0606b1c0ccc0/merged major:0 minor:151 fsType:overlay blockSize:0} 
overlay_0-153:{mountpoint:/var/lib/containers/storage/overlay/80c1684eb890fb1daa732afcbbd75db2b619a73b440b9d8c2bccc078dc877a36/merged major:0 minor:153 fsType:overlay blockSize:0} overlay_0-155:{mountpoint:/var/lib/containers/storage/overlay/5753da5558b9f8596dab87b5dbf7d299de3c68169d41040c325e68f583623825/merged major:0 minor:155 fsType:overlay blockSize:0} overlay_0-157:{mountpoint:/var/lib/containers/storage/overlay/2cbd7bbce53634c485921b252dc3061642d0855d651b865add08036c22585321/merged major:0 minor:157 fsType:overlay blockSize:0} overlay_0-165:{mountpoint:/var/lib/containers/storage/overlay/e2a2a7fd2682f648d30a42254b47101c842bf68b8e2a60186ad746707892f76a/merged major:0 minor:165 fsType:overlay blockSize:0} overlay_0-167:{mountpoint:/var/lib/containers/storage/overlay/75e8e9d8e8519bd4cf96fd700f21930d4d0bd5b668b5431b8337458011a7d74d/merged major:0 minor:167 fsType:overlay blockSize:0} overlay_0-173:{mountpoint:/var/lib/containers/storage/overlay/42ba82f93953383102c71431ae42faf4e61b9734194f958c695db0e7cf201504/merged major:0 minor:173 fsType:overlay blockSize:0} overlay_0-181:{mountpoint:/var/lib/containers/storage/overlay/5c3360d970e6e048298c4d9620de740a87fb1a67a55381dd871c6a9829c8b9b9/merged major:0 minor:181 fsType:overlay blockSize:0} overlay_0-183:{mountpoint:/var/lib/containers/storage/overlay/a565e58719d5970fc9bfb6b872797b60ec22a84ef0b2761fd6fa3c88d1962e13/merged major:0 minor:183 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/75ddef7480f281ec07ceca3ce28f6c684a389b0b57e289673849e30c4b0081b6/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-191:{mountpoint:/var/lib/containers/storage/overlay/749a2a069dd05cd62cd3d08ecace9255c2b9f5f83cdaf0801c5159793285337d/merged major:0 minor:191 fsType:overlay blockSize:0} overlay_0-193:{mountpoint:/var/lib/containers/storage/overlay/062819f0492cedc7321aae8a6f04970c6d37225b8e9e71e76c4703e921cdc989/merged major:0 minor:193 fsType:overlay blockSize:0} 
overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/154db8920aed66f693824269968b5900519a3200f2964785bebbd77a8f93591a/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-197:{mountpoint:/var/lib/containers/storage/overlay/658fa1657c4947ab3420052e1438ec6755e21ef571576b9538fe1278ca214b9c/merged major:0 minor:197 fsType:overlay blockSize:0} overlay_0-211:{mountpoint:/var/lib/containers/storage/overlay/cd0e0c415170062505067cde2da0b5e6ec7259656863f73159217b9563cef5a3/merged major:0 minor:211 fsType:overlay blockSize:0} overlay_0-213:{mountpoint:/var/lib/containers/storage/overlay/c32c9e29ea7b415b5875cc854cecf3e7a5b29c7b52a830d1035bdf00878f2553/merged major:0 minor:213 fsType:overlay blockSize:0} overlay_0-215:{mountpoint:/var/lib/containers/storage/overlay/776841f50d586cda6eb39e144487c3c0b98792ce5a700475b7a9dab55c779a3d/merged major:0 minor:215 fsType:overlay blockSize:0} overlay_0-226:{mountpoint:/var/lib/containers/storage/overlay/9febea4b01b238ed11f90be195d61cfa2c2d3d684569bbcee9a4b95a9589e32a/merged major:0 minor:226 fsType:overlay blockSize:0} overlay_0-234:{mountpoint:/var/lib/containers/storage/overlay/826fb9274729a04ac8bf1f2dc7809bbfdcd1ef2d903ce8925e12a96c0d54ccdc/merged major:0 minor:234 fsType:overlay blockSize:0} overlay_0-242:{mountpoint:/var/lib/containers/storage/overlay/3ea75403be9ff49f0d89e395fb71c8afdcf31c2dca4bf3be27c0bfc9c2b02b1f/merged major:0 minor:242 fsType:overlay blockSize:0} overlay_0-250:{mountpoint:/var/lib/containers/storage/overlay/542f68d90ba74a108bed62852741fecae67d09e6d48c00aab4641f44fc4d9b27/merged major:0 minor:250 fsType:overlay blockSize:0} overlay_0-258:{mountpoint:/var/lib/containers/storage/overlay/577a4d9cb019e724a8e12ef1daee525d6fa1600006aef5edbb01e7dcd910283b/merged major:0 minor:258 fsType:overlay blockSize:0} overlay_0-266:{mountpoint:/var/lib/containers/storage/overlay/9507e651f29e86bf69ac5f0fa8947bc090d60c10765e92eaecfe6ed356113cd0/merged major:0 minor:266 fsType:overlay blockSize:0} 
overlay_0-271:{mountpoint:/var/lib/containers/storage/overlay/c560ad2fe306a287e3b7e2af00ede38dc48fd77694968e2249e1766389296e9d/merged major:0 minor:271 fsType:overlay blockSize:0} overlay_0-344:{mountpoint:/var/lib/containers/storage/overlay/0d2e73df435b5c0a11afee4c921f2059332b9ae28d1942e279e4876055cb3c49/merged major:0 minor:344 fsType:overlay blockSize:0} overlay_0-348:{mountpoint:/var/lib/containers/storage/overlay/0cabac5ebcc4204670610148af33c222eb06802118745b08f536d38696fbcb8d/merged major:0 minor:348 fsType:overlay blockSize:0} overlay_0-350:{mountpoint:/var/lib/containers/storage/overlay/d7cf977cc989de992a217f0075826286da5e96a4a849f319fedd84ce7fe63bf2/merged major:0 minor:350 fsType:overlay blockSize:0} overlay_0-352:{mountpoint:/var/lib/containers/storage/overlay/0b6b91b67dfb1ed892203184f5fcf8a0515e22da4351249925d088a82543de78/merged major:0 minor:352 fsType:overlay blockSize:0} overlay_0-354:{mountpoint:/var/lib/containers/storage/overlay/a4cf883831a1b1ad73ed376a48b26380e5d77283245da5c25e12fd1684801427/merged major:0 minor:354 fsType:overlay blockSize:0} overlay_0-356:{mountpoint:/var/lib/containers/storage/overlay/357df791a42b82a3f0a028f1fa70e430ff1f4762ddb53dc6da86b2c969315824/merged major:0 minor:356 fsType:overlay blockSize:0} overlay_0-360:{mountpoint:/var/lib/containers/storage/overlay/7ddddda0c20864a2ba21f7bd6bdd3684a476ddec7c8904bba715aedfb231f085/merged major:0 minor:360 fsType:overlay blockSize:0} overlay_0-362:{mountpoint:/var/lib/containers/storage/overlay/63e4b31005f005596c90661bcd11ea6b0d7f6763d3f48f6674ba2f76d9631872/merged major:0 minor:362 fsType:overlay blockSize:0} overlay_0-364:{mountpoint:/var/lib/containers/storage/overlay/d92751fea002c7312927f4c988ef66b0e5f5e32a423ef53eae60aa655aff8b4f/merged major:0 minor:364 fsType:overlay blockSize:0} overlay_0-373:{mountpoint:/var/lib/containers/storage/overlay/714b90d43dafb90c0316a8aa5770ffb0680b71be5ea6c56fa1bdbbe88e555ed6/merged major:0 minor:373 fsType:overlay blockSize:0} 
overlay_0-375:{mountpoint:/var/lib/containers/storage/overlay/cb02664550bde6617b9760e31c58cab812be593725e7360e72a4a1b06c7d0aeb/merged major:0 minor:375 fsType:overlay blockSize:0} overlay_0-377:{mountpoint:/var/lib/containers/storage/overlay/739bca9868adbe0ae08dc4d285f26c54badb979e34da856ec7aacdd9e8939bdd/merged major:0 minor:377 fsType:overlay blockSize:0} overlay_0-379:{mountpoint:/var/lib/containers/storage/overlay/742873d59edab48b4d0150a0ca9e690e84ffd20e86825d4e39fd5bcb9ebc0795/merged major:0 minor:379 fsType:overlay blockSize:0} overlay_0-384:{mountpoint:/var/lib/containers/storage/overlay/2d66b49203bddda94e368a202d17e06d683060a7ebb9ae7ecaf7e896b9519926/merged major:0 minor:384 fsType:overlay blockSize:0} overlay_0-389:{mountpoint:/var/lib/containers/storage/overlay/61d68b7b6b3bc37193191b703eaad6aa7631530c25d5455ac08a025c63e279c9/merged major:0 minor:389 fsType:overlay blockSize:0} overlay_0-391:{mountpoint:/var/lib/containers/storage/overlay/5fcfca9079ea12684bfa9058df007d8a08f27b18f97257ee161629dabeaf4d8c/merged major:0 minor:391 fsType:overlay blockSize:0} overlay_0-393:{mountpoint:/var/lib/containers/storage/overlay/a503c87b4b027b796b1e94113b14ca5c635596c129efb72c60325dd8162b6f9a/merged major:0 minor:393 fsType:overlay blockSize:0} overlay_0-395:{mountpoint:/var/lib/containers/storage/overlay/78e87158b2620022191654c27155d56b1f90aa3adf478c9567e88396e255b499/merged major:0 minor:395 fsType:overlay blockSize:0} overlay_0-397:{mountpoint:/var/lib/containers/storage/overlay/14bf6ef8f13b7050948309170f0cd58405251532de64b68bc9562804af025270/merged major:0 minor:397 fsType:overlay blockSize:0} overlay_0-399:{mountpoint:/var/lib/containers/storage/overlay/15ec5ef5cb2094bbe12c651082d36f397d4327694982368c08c50df03cd24368/merged major:0 minor:399 fsType:overlay blockSize:0} overlay_0-401:{mountpoint:/var/lib/containers/storage/overlay/1e64f575c7b1d7735e91512126037ab6c94b13dfc988906ebff41f9ba9b2ec2b/merged major:0 minor:401 fsType:overlay blockSize:0} 
overlay_0-403:{mountpoint:/var/lib/containers/storage/overlay/0424e210a4346ef452c66b936e59558976bbb8761fe81412996a8cd2bcec727c/merged major:0 minor:403 fsType:overlay blockSize:0} overlay_0-405:{mountpoint:/var/lib/containers/storage/overlay/1458e50c7d4eb7ed74fd7ce2a2da5206b71dd9a0d64715b1b4c74fa8521cf6d6/merged major:0 minor:405 fsType:overlay blockSize:0} overlay_0-407:{mountpoint:/var/lib/containers/storage/overlay/0a1a2c379b383ef069b60697c39d6bb7d027046834e8c1f38cef4a4eca8cf7ef/merged major:0 minor:407 fsType:overlay blockSize:0} overlay_0-409:{mountpoint:/var/lib/containers/storage/overlay/80fec5273b3c75f9ba02b2a183f01f04c4589f174c37333f196d1d3be3d34002/merged major:0 minor:409 fsType:overlay blockSize:0} overlay_0-411:{mountpoint:/var/lib/containers/storage/overlay/a0ac6d0acd420fb457c8def6a16b56761986ac41206171c86cf2b7702536f82b/merged major:0 minor:411 fsType:overlay blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/6db36944aa97848b1163a8f35f5a88f701dc80873a1f4bc05f32d52978679626/merged major:0 minor:43 fsType:overlay blockSize:0} overlay_0-44:{mountpoint:/var/lib/containers/storage/overlay/a607d05062390dc67575ae5f9b5e9ea470b8485c41ee34bdb29837c444fcce97/merged major:0 minor:44 fsType:overlay blockSize:0} overlay_0-469:{mountpoint:/var/lib/containers/storage/overlay/fcadd55d474b4729398f24c0eeab4955a838c8d35c28185a03e92c0c90e2ba45/merged major:0 minor:469 fsType:overlay blockSize:0} overlay_0-478:{mountpoint:/var/lib/containers/storage/overlay/44d4c16a44479fadbb6b64debcd9de88d9d2b9bd32e706674591b238bce78a90/merged major:0 minor:478 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/af0eab2f5a9fd644563da683dfca8f0d6ebdbbf25170454e732d8341094a1968/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/17d305f53778cbe6e1413f914d9f6e44c0e9140b6901e3d4422ccf1234ebbf4e/merged major:0 minor:52 fsType:overlay blockSize:0} 
overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/c9b974bfded6fd074eb6070fa69063dec031f426f444af22d03a7759752120d3/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/08963768f75114cd9191d22721f1a1a86b585929e3b1a7b5747ff378a14dde17/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/9aefb9f47e3c8709a30a7a83982b435dcafddb2345f1a76614c1108a44505ff4/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/3ddb3633fb7619481a94cea31431ddea1872e82371adf01e2a4c6357f3419f72/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/0bae5d1b6c59876e0f85328ef66950e0be20f36c12937bcbbc848fb02cf9bea3/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-74:{mountpoint:/var/lib/containers/storage/overlay/517a64f84dbd613a2c1b944d82065fac446d1aba259554a602c09d8d2652b066/merged major:0 minor:74 fsType:overlay blockSize:0} overlay_0-78:{mountpoint:/var/lib/containers/storage/overlay/3266d973e3c73506d5ba26529993154805f5a1097fa887d32f1c6db2423b276d/merged major:0 minor:78 fsType:overlay blockSize:0} overlay_0-80:{mountpoint:/var/lib/containers/storage/overlay/6247288afed6f088e479ed95a408aea57181ac74de9404a87b2bd0568d39c9b2/merged major:0 minor:80 fsType:overlay blockSize:0} overlay_0-82:{mountpoint:/var/lib/containers/storage/overlay/c915a06aec4f81cce69757183eac147989ff6049af09397a85602c14bdbc1cdc/merged major:0 minor:82 fsType:overlay blockSize:0}] Dec 03 19:55:34.480146 master-0 kubenswrapper[9368]: I1203 19:55:34.479132 9368 manager.go:217] Machine: {Timestamp:2025-12-03 19:55:34.477613626 +0000 UTC m=+0.138863537 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2799998 MemoryCapacity:50514153472 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} 
HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:9870f3c6b33d40089e247d1fa3d9248c SystemUUID:9870f3c6-b33d-4008-9e24-7d1fa3d9248c BootID:2118df0c-6317-4582-908c-71a63e50558d Filesystems:[{Device:overlay_0-393 DeviceMajor:0 DeviceMinor:393 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/367c2c7c-1fc8-4608-aa94-b64c6c70cc61/volumes/kubernetes.io~projected/kube-api-access-hb5j7 DeviceMajor:0 DeviceMinor:487 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-147 DeviceMajor:0 DeviceMinor:147 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/01d51d9a-9beb-4357-9dc2-aeac210cd0c4/volumes/kubernetes.io~projected/kube-api-access-6sqtm DeviceMajor:0 DeviceMinor:323 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/38072447ae412858938614108f0275e0c66bb65d93f888cc2667f73663ae0790/userdata/shm DeviceMajor:0 DeviceMinor:346 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-352 DeviceMajor:0 DeviceMinor:352 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ba68608f-6b36-455e-b80b-d19237df9312/volumes/kubernetes.io~projected/kube-api-access-855t4 DeviceMajor:0 DeviceMinor:381 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-104 DeviceMajor:0 DeviceMinor:104 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/daa8efc0-4514-4a14-80f5-ab9eca53a127/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:291 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/daa8efc0-4514-4a14-80f5-ab9eca53a127/volumes/kubernetes.io~projected/kube-api-access-rbsx8 DeviceMajor:0 DeviceMinor:308 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-375 DeviceMajor:0 DeviceMinor:375 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-391 DeviceMajor:0 DeviceMinor:391 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c4d45235-fb1a-4626-a41e-b1e34f7bf76e/volumes/kubernetes.io~projected/kube-api-access-qhg82 DeviceMajor:0 DeviceMinor:185 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0c22de28b514bd9de5323a780b66baaf0574a8898405da26c3c85130d1ec1ce9/userdata/shm DeviceMajor:0 DeviceMinor:315 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-364 DeviceMajor:0 DeviceMinor:364 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-399 DeviceMajor:0 DeviceMinor:399 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:288 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8b5478f55322c86d9620262432fda124f2df1ae79e09d51d64ffbf6929820091/userdata/shm DeviceMajor:0 DeviceMinor:358 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-379 DeviceMajor:0 DeviceMinor:379 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d5f33153-bff1-403f-ae17-b7e90500365d/volumes/kubernetes.io~secret/profile-collector-cert DeviceMajor:0 DeviceMinor:299 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/volumes/kubernetes.io~projected/kube-api-access-6bhk4 DeviceMajor:0 DeviceMinor:303 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-131 DeviceMajor:0 DeviceMinor:131 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:290 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/78a864f2-934f-4197-9753-24c9bc7f1fca/volumes/kubernetes.io~projected/kube-api-access-59d2r DeviceMajor:0 DeviceMinor:306 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-403 DeviceMajor:0 DeviceMinor:403 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-191 DeviceMajor:0 DeviceMinor:191 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2f618ea7-3ad7-4dce-b450-a8202285f312/volumes/kubernetes.io~projected/kube-api-access-4c9qq DeviceMajor:0 DeviceMinor:170 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-193 DeviceMajor:0 DeviceMinor:193 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-213 DeviceMajor:0 DeviceMinor:213 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f9f99422-7991-40ef-92a1-de2e603e47b9/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:297 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-271 DeviceMajor:0 DeviceMinor:271 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-155 DeviceMajor:0 DeviceMinor:155 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/01d51d9a-9beb-4357-9dc2-aeac210cd0c4/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:298 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/371917da-b783-4acc-81af-1cfc903269f4/volumes/kubernetes.io~projected/kube-api-access-w4v7k DeviceMajor:0 
DeviceMinor:341 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e223c914bb9fdab3679b22e12a3423e70834ea2d5e7b1b525318a3b2a1eb7382/userdata/shm DeviceMajor:0 DeviceMinor:46 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-74 DeviceMajor:0 DeviceMinor:74 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/75f2d2ce983b4d5090010050d78ba28c8452643f80661c230a1cbdc90a216214/userdata/shm DeviceMajor:0 DeviceMinor:187 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-82 DeviceMajor:0 DeviceMinor:82 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0c45d22f-1492-47d7-83b6-6dd356a8454d/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:120 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-173 DeviceMajor:0 DeviceMinor:173 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-234 DeviceMajor:0 DeviceMinor:234 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5decce88-c71e-411c-87b5-a37dd0f77e7b/volumes/kubernetes.io~projected/kube-api-access-mr8x9 DeviceMajor:0 DeviceMinor:340 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/08c01ca5f1fe5f2ef9cd1ac17b729f8e737e95206dcb86f9ce9c09225b746a55/userdata/shm DeviceMajor:0 DeviceMinor:342 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/988c74f72d6d3987e23eadc15e10a46097f9412b88f2d407e398a913b05fa016/userdata/shm DeviceMajor:0 DeviceMinor:488 Capacity:67108864 
Type:vfs Inodes:6166278 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-149 DeviceMajor:0 DeviceMinor:149 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-165 DeviceMajor:0 DeviceMinor:165 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b84835e3-e8bc-4aa4-a8f3-f9be702a358a/volumes/kubernetes.io~projected/kube-api-access-vtwbs DeviceMajor:0 DeviceMinor:309 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/49efa7facfce8d50bf6399ae2e6f96a9a16dc5f311b520ce196c50d981643fd1/userdata/shm DeviceMajor:0 DeviceMinor:320 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/11e2c94f-f9e9-415b-a550-3006a4632ba4/volumes/kubernetes.io~projected/kube-api-access-pfqnq DeviceMajor:0 DeviceMinor:319 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-384 DeviceMajor:0 DeviceMinor:384 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/80067c895e8606a9acda897c0ee9b8e4c440d9838ee8d74d86c0a12d51b59462/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fb66039883a03fd1626aa3dffc21a20bb7b9e0bf48c135b576d0ba2ac23105d3/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-138 DeviceMajor:0 DeviceMinor:138 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7ed25861-1328-45e7-922e-37588a0b019c/volumes/kubernetes.io~projected/kube-api-access-cv24n DeviceMajor:0 DeviceMinor:301 Capacity:49335554048 Type:vfs Inodes:6166278 
HasInodes:true} {Device:overlay_0-250 DeviceMajor:0 DeviceMinor:250 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5decce88-c71e-411c-87b5-a37dd0f77e7b/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:304 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ef04edbf93893169f2ce0656a624fe737e2b430675591752e41e98b545e6bf40/userdata/shm DeviceMajor:0 DeviceMinor:334 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-157 DeviceMajor:0 DeviceMinor:157 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5b3ee9a2-0f17-4a04-9191-b60684ef6c29/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:313 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/128ed384-7ab6-41b6-bf45-c8fda917d52f/volumes/kubernetes.io~projected/kube-api-access-7qrgh DeviceMajor:0 DeviceMinor:336 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-124 DeviceMajor:0 DeviceMinor:124 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-153 DeviceMajor:0 DeviceMinor:153 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-397 DeviceMajor:0 DeviceMinor:397 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6eb4700c-6af0-468b-afc8-1e09b902d6bf/volumes/kubernetes.io~projected/kube-api-access-w7nkb DeviceMajor:0 DeviceMinor:119 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/f9f99422-7991-40ef-92a1-de2e603e47b9/volumes/kubernetes.io~projected/kube-api-access-pk4z4 DeviceMajor:0 DeviceMinor:330 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-348 DeviceMajor:0 DeviceMinor:348 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-411 
DeviceMajor:0 DeviceMinor:411 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/volumes/kubernetes.io~projected/kube-api-access-457ln DeviceMajor:0 DeviceMinor:314 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:333 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-360 DeviceMajor:0 DeviceMinor:360 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-80 DeviceMajor:0 DeviceMinor:80 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6/volumes/kubernetes.io~projected/kube-api-access-bztz2 DeviceMajor:0 DeviceMinor:118 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-242 DeviceMajor:0 DeviceMinor:242 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:289 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/d5f33153-bff1-403f-ae17-b7e90500365d/volumes/kubernetes.io~projected/kube-api-access-5sdw4 DeviceMajor:0 DeviceMinor:305 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/15fa7ece9624e476a927666dc492b7bd2df94f7942d686ce643ec390d690ecca/userdata/shm DeviceMajor:0 DeviceMinor:387 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-478 DeviceMajor:0 DeviceMinor:478 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/d0d3daec6476579642facd81cb6257eb10f7c617299056e3757e4a0c79c948a4/userdata/shm DeviceMajor:0 DeviceMinor:121 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/830d89af-1266-43ac-b113-990a28595f91/volumes/kubernetes.io~projected/kube-api-access-lkhcw DeviceMajor:0 DeviceMinor:386 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-167 DeviceMajor:0 DeviceMinor:167 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-215 DeviceMajor:0 DeviceMinor:215 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/943feb0d-7d31-446a-9100-dfc4ef013d12/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:307 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-44 DeviceMajor:0 DeviceMinor:44 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-78 DeviceMajor:0 DeviceMinor:78 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d210062f-c07e-419f-a551-c37571565686/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:161 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/d210062f-c07e-419f-a551-c37571565686/volumes/kubernetes.io~projected/kube-api-access-v7xk9 DeviceMajor:0 DeviceMinor:162 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b7019b680708a6b0cc34565d068ec422e5cf82d6c1379cc668471d678f72f33d/userdata/shm DeviceMajor:0 DeviceMinor:163 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-377 
DeviceMajor:0 DeviceMinor:377 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-409 DeviceMajor:0 DeviceMinor:409 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257078784 Type:vfs Inodes:1048576 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c8fa62db9ae1d5afc07c786415f97448d1baeaca29acf6f92b49c7da920421a7/userdata/shm DeviceMajor:0 DeviceMinor:317 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-373 DeviceMajor:0 DeviceMinor:373 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-469 DeviceMajor:0 DeviceMinor:469 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-151 DeviceMajor:0 DeviceMinor:151 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-183 DeviceMajor:0 DeviceMinor:183 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-389 DeviceMajor:0 DeviceMinor:389 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-395 DeviceMajor:0 DeviceMinor:395 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2f618ea7-3ad7-4dce-b450-a8202285f312/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:169 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-344 DeviceMajor:0 DeviceMinor:344 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-350 DeviceMajor:0 DeviceMinor:350 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/fd040c8de744a713ee80a954f75065a2b691638426b8496773ad0910f9875316/userdata/shm DeviceMajor:0 DeviceMinor:122 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ec90b46e5817f62e5cb3d92e8419aeaaa1a2c0a9eebd84f2c7545dcfdabcf365/userdata/shm DeviceMajor:0 DeviceMinor:331 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-401 DeviceMajor:0 DeviceMinor:401 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-405 DeviceMajor:0 DeviceMinor:405 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-407 DeviceMajor:0 DeviceMinor:407 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b673cb04-f6f0-4113-bdcd-d6685b942c9f/volumes/kubernetes.io~projected/kube-api-access-m2qch DeviceMajor:0 DeviceMinor:312 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-354 DeviceMajor:0 DeviceMinor:354 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/03f773582fd952a02e3c74054c230118b5ae30a27243d494447b73fc93b2a301/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-127 DeviceMajor:0 DeviceMinor:127 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-133 DeviceMajor:0 DeviceMinor:133 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-226 DeviceMajor:0 DeviceMinor:226 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/78a864f2-934f-4197-9753-24c9bc7f1fca/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:295 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-356 DeviceMajor:0 DeviceMinor:356 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:overlay_0-362 DeviceMajor:0 DeviceMinor:362 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c4d45235-fb1a-4626-a41e-b1e34f7bf76e/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:186 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-211 DeviceMajor:0 DeviceMinor:211 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/11e2c94f-f9e9-415b-a550-3006a4632ba4/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:296 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a185ee17-4b4b-4d20-a8ed-56a2a01f1807/volumes/kubernetes.io~projected/kube-api-access-sxqph DeviceMajor:0 DeviceMinor:302 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/b4316c8d-a1d3-4e51-83cc-d0eecb809924/volumes/kubernetes.io~projected/kube-api-access-74dvx DeviceMajor:0 DeviceMinor:338 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a19b8f9e-6299-43bf-9aa5-22071b855773/volumes/kubernetes.io~secret/profile-collector-cert DeviceMajor:0 DeviceMinor:300 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a19b8f9e-6299-43bf-9aa5-22071b855773/volumes/kubernetes.io~projected/kube-api-access-6ghnf DeviceMajor:0 DeviceMinor:310 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:322 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/46b628f030def8d568abe6c88697be71ce064596569bc0a66bddd83c9802cf26/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:overlay_0-197 DeviceMajor:0 DeviceMinor:197 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-258 DeviceMajor:0 DeviceMinor:258 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/943feb0d-7d31-446a-9100-dfc4ef013d12/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:293 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a336a885dee021602a811c06c05965d3ceafbc2a4e4dc7061efbb563491832b7/userdata/shm DeviceMajor:0 DeviceMinor:324 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/87f1759a-7df4-442e-a22d-6de8d54be333/volumes/kubernetes.io~projected/kube-api-access-wvllg DeviceMajor:0 DeviceMinor:135 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/46b5d4d0-b841-4e87-84b4-85911ff04325/volumes/kubernetes.io~projected/kube-api-access-s2c85 DeviceMajor:0 DeviceMinor:146 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b4885c85229f1632ce115036d60c7a6767b9efe2b85e96fadba3614a99fdc575/userdata/shm DeviceMajor:0 DeviceMinor:326 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1677ae8793f1b3e61b335ded5b7ac95e63d604742bdba149b92ecb06281d760f/userdata/shm DeviceMajor:0 DeviceMinor:171 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a185ee17-4b4b-4d20-a8ed-56a2a01f1807/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:294 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e6304ea619f0b996e6dede6cd4e07910aa977eac4013d0444808ca8298842f22/userdata/shm DeviceMajor:0 DeviceMinor:327 
Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-129 DeviceMajor:0 DeviceMinor:129 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-181 DeviceMajor:0 DeviceMinor:181 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-266 DeviceMajor:0 DeviceMinor:266 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5b3ee9a2-0f17-4a04-9191-b60684ef6c29/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:292 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f6cfc0641f7e192cbb940115d2ba3add0762b14146ea756523e733a04332e0a9/userdata/shm DeviceMajor:0 DeviceMinor:337 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/db6143edbd1b68cfe8bbe553ee3ca87d799ea0e63aff48d4d038dfa43496204a/userdata/shm DeviceMajor:0 DeviceMinor:382 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/var/lib/kubelet/pods/6eb4700c-6af0-468b-afc8-1e09b902d6bf/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:77 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/048697d9a6342582c8b3059ecb9a0cfe7c0a764a192a00f0ded82f4081cc7252/userdata/shm DeviceMajor:0 DeviceMinor:136 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/0d4e4f88-7106-4a46-8b63-053345922fb0/volumes/kubernetes.io~projected/kube-api-access-crfnp DeviceMajor:0 DeviceMinor:286 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/volumes/kubernetes.io~projected/kube-api-access-qdhcd DeviceMajor:0 DeviceMinor:311 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257074688 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-102 DeviceMajor:0 DeviceMinor:102 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2f618ea7-3ad7-4dce-b450-a8202285f312/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/78a864f2-934f-4197-9753-24c9bc7f1fca/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:287 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:08c01ca5f1fe5f2 MacAddress:da:fa:76:4c:74:41 Speed:10000 Mtu:8900} {Name:0c22de28b514bd9 MacAddress:4e:14:52:68:30:38 Speed:10000 Mtu:8900} {Name:15fa7ece9624e47 MacAddress:c6:2b:13:99:74:29 Speed:10000 Mtu:8900} {Name:38072447ae41285 MacAddress:92:a5:a1:be:a6:9f Speed:10000 Mtu:8900} {Name:49efa7facfce8d5 MacAddress:3e:bb:1a:68:9e:3e Speed:10000 Mtu:8900} {Name:988c74f72d6d398 MacAddress:aa:83:55:b9:ed:6f Speed:10000 Mtu:8900} {Name:a336a885dee0216 MacAddress:d6:3f:31:49:e0:b3 Speed:10000 Mtu:8900} {Name:b4885c85229f163 MacAddress:4e:9e:9a:21:1e:13 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:2a:36:a0:22:39:c7 Speed:0 Mtu:8900} {Name:c8fa62db9ae1d5a MacAddress:fe:0b:61:d6:83:1d Speed:10000 Mtu:8900} {Name:db6143edbd1b68c MacAddress:1e:0e:41:c4:03:10 Speed:10000 Mtu:8900} {Name:e6304ea619f0b99 MacAddress:2e:85:23:56:cd:15 Speed:10000 Mtu:8900} {Name:ec90b46e5817f62 MacAddress:2a:dd:4f:70:1d:d2 Speed:10000 Mtu:8900} 
{Name:ef04edbf9389316 MacAddress:ca:72:6e:0e:e0:f4 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:c1:91:ba Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:45:dc:6d Speed:-1 Mtu:9000} {Name:f6cfc0641f7e192 MacAddress:fe:58:40:49:74:41 Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:1a:18:db:b8:db:2e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514153472 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 
Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 
Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 03 19:55:34.480146 master-0 kubenswrapper[9368]: I1203 19:55:34.480126 9368 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 03 19:55:34.480542 master-0 kubenswrapper[9368]: I1203 19:55:34.480338 9368 manager.go:233] Version: {KernelVersion:5.14.0-427.97.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202511041748-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 03 19:55:34.480669 master-0 kubenswrapper[9368]: I1203 19:55:34.480629 9368 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 03 19:55:34.481019 master-0 kubenswrapper[9368]: I1203 19:55:34.480919 9368 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 03 19:55:34.481212 master-0 kubenswrapper[9368]: I1203 19:55:34.480951 9368 container_manager_linux.go:272] "Creating Container Manager object based on Node Config"
nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 03 19:55:34.481365 master-0 kubenswrapper[9368]: I1203 19:55:34.481255 9368 topology_manager.go:138] "Creating topology manager with none policy"
Dec 03 19:55:34.481365 master-0 kubenswrapper[9368]: I1203 19:55:34.481270 9368 container_manager_linux.go:303] "Creating device plugin manager"
Dec 03 19:55:34.481365 master-0 kubenswrapper[9368]: I1203 19:55:34.481281 9368 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 19:55:34.481365 master-0 kubenswrapper[9368]: I1203 19:55:34.481325 9368 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 19:55:34.481516 master-0 kubenswrapper[9368]: I1203 19:55:34.481500 9368 state_mem.go:36] "Initialized new in-memory state store"
Dec 03 19:55:34.481638 master-0 kubenswrapper[9368]: I1203 19:55:34.481611 9368 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 03 19:55:34.481739 master-0 kubenswrapper[9368]: I1203 19:55:34.481706 9368 kubelet.go:418] "Attempting to sync node with API server"
Dec 03 19:55:34.481798 master-0 kubenswrapper[9368]: I1203 19:55:34.481743 9368 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 03 19:55:34.481798 master-0 kubenswrapper[9368]: I1203 19:55:34.481760 9368 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 03 19:55:34.481862 master-0 kubenswrapper[9368]: I1203 19:55:34.481804 9368 kubelet.go:324] "Adding apiserver pod source"
Dec 03 19:55:34.481862 master-0 kubenswrapper[9368]: I1203 19:55:34.481828 9368 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 03 19:55:34.483472 master-0 kubenswrapper[9368]: I1203 19:55:34.483441 9368 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-2.rhaos4.18.git15789b8.el9" apiVersion="v1"
Dec 03 19:55:34.483918 master-0 kubenswrapper[9368]: I1203 19:55:34.483898 9368 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 03 19:55:34.486708 master-0 kubenswrapper[9368]: I1203 19:55:34.486678 9368 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 03 19:55:34.492993 master-0 kubenswrapper[9368]: I1203 19:55:34.492955 9368 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 03 19:55:34.492993 master-0 kubenswrapper[9368]: I1203 19:55:34.492998 9368 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 03 19:55:34.492993 master-0 kubenswrapper[9368]: I1203 19:55:34.493010 9368 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 03 19:55:34.492993 master-0 kubenswrapper[9368]: I1203 19:55:34.493017 9368 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 03 19:55:34.492993 master-0 kubenswrapper[9368]: I1203 19:55:34.493024 9368 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 03 19:55:34.492993 master-0 kubenswrapper[9368]: I1203 19:55:34.493038 9368 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 03 19:55:34.492993 master-0 kubenswrapper[9368]: I1203 19:55:34.493045 9368 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 03 19:55:34.492993 master-0 kubenswrapper[9368]: I1203 19:55:34.493051 9368 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 03 19:55:34.492993 master-0 kubenswrapper[9368]: I1203 19:55:34.493060 9368 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 03 19:55:34.492993 master-0 kubenswrapper[9368]: I1203 19:55:34.493069 9368 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 03 19:55:34.492993 master-0 kubenswrapper[9368]: I1203 19:55:34.493079 9368 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 03 19:55:34.492993 master-0 kubenswrapper[9368]: I1203 19:55:34.493097 9368 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 03 19:55:34.493517 master-0 kubenswrapper[9368]: I1203 19:55:34.493120 9368 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 03 19:55:34.493517 master-0 kubenswrapper[9368]: I1203 19:55:34.493384 9368 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 03 19:55:34.493616 master-0 kubenswrapper[9368]: I1203 19:55:34.493597 9368 server.go:1280] "Started kubelet"
Dec 03 19:55:34.493887 master-0 kubenswrapper[9368]: I1203 19:55:34.493841 9368 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 03 19:55:34.494036 master-0 kubenswrapper[9368]: I1203 19:55:34.493998 9368 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 03 19:55:34.495200 master-0 kubenswrapper[9368]: I1203 19:55:34.494520 9368 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 03 19:55:34.495200 master-0 kubenswrapper[9368]: I1203 19:55:34.494591 9368 server_v1.go:47] "podresources" method="list" useActivePods=true
Dec 03 19:55:34.495200 master-0 kubenswrapper[9368]: I1203 19:55:34.495083 9368 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 03 19:55:34.495155 master-0 systemd[1]: Started Kubernetes Kubelet.
Dec 03 19:55:34.495991 master-0 kubenswrapper[9368]: I1203 19:55:34.495961 9368 server.go:449] "Adding debug handlers to kubelet server"
Dec 03 19:55:34.505493 master-0 kubenswrapper[9368]: I1203 19:55:34.504737 9368 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 03 19:55:34.505493 master-0 kubenswrapper[9368]: I1203 19:55:34.504783 9368 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 03 19:55:34.505493 master-0 kubenswrapper[9368]: I1203 19:55:34.505263 9368 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-12-04 19:44:55 +0000 UTC, rotation deadline is 2025-12-04 16:37:11.082545495 +0000 UTC
Dec 03 19:55:34.505493 master-0 kubenswrapper[9368]: I1203 19:55:34.505330 9368 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 20h41m36.577218414s for next certificate rotation
Dec 03 19:55:34.505634 master-0 kubenswrapper[9368]: I1203 19:55:34.505613 9368 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 03 19:55:34.505634 master-0 kubenswrapper[9368]: I1203 19:55:34.505626 9368 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 03 19:55:34.505743 master-0 kubenswrapper[9368]: I1203 19:55:34.505722 9368 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Dec 03 19:55:34.506263 master-0 kubenswrapper[9368]: I1203 19:55:34.506243 9368 factory.go:55] Registering systemd factory
Dec 03 19:55:34.506349 master-0 kubenswrapper[9368]: I1203 19:55:34.506339 9368 factory.go:221] Registration of the systemd container factory successfully
Dec 03 19:55:34.506742 master-0 kubenswrapper[9368]: I1203 19:55:34.506731 9368 factory.go:153] Registering CRI-O factory
Dec 03 19:55:34.506836 master-0 kubenswrapper[9368]: I1203 19:55:34.506826 9368 factory.go:221] Registration of the crio container factory successfully
Dec 03 19:55:34.506947 master-0 kubenswrapper[9368]: I1203 19:55:34.506937 9368 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 03 19:55:34.507123 master-0 kubenswrapper[9368]: I1203 19:55:34.507112 9368 factory.go:103] Registering Raw factory
Dec 03 19:55:34.507188 master-0 kubenswrapper[9368]: I1203 19:55:34.507181 9368 manager.go:1196] Started watching for new ooms in manager
Dec 03 19:55:34.507769 master-0 kubenswrapper[9368]: I1203 19:55:34.507756 9368 manager.go:319] Starting recovery of all containers
Dec 03 19:55:34.508151 master-0 kubenswrapper[9368]: I1203 19:55:34.508059 9368 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 03 19:55:34.510643 master-0 kubenswrapper[9368]: I1203 19:55:34.510582 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="367c2c7c-1fc8-4608-aa94-b64c6c70cc61" volumeName="kubernetes.io/projected/367c2c7c-1fc8-4608-aa94-b64c6c70cc61-kube-api-access-hb5j7" seLinuxMountContext=""
Dec 03 19:55:34.510643 master-0 kubenswrapper[9368]: I1203 19:55:34.510641 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46b5d4d0-b841-4e87-84b4-85911ff04325" volumeName="kubernetes.io/projected/46b5d4d0-b841-4e87-84b4-85911ff04325-kube-api-access-s2c85" seLinuxMountContext=""
Dec 03 19:55:34.510750 master-0 kubenswrapper[9368]: I1203 19:55:34.510656 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87f1759a-7df4-442e-a22d-6de8d54be333" volumeName="kubernetes.io/projected/87f1759a-7df4-442e-a22d-6de8d54be333-kube-api-access-wvllg" seLinuxMountContext=""
Dec 03 19:55:34.510750 master-0 kubenswrapper[9368]: I1203 19:55:34.510665 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="c4d45235-fb1a-4626-a41e-b1e34f7bf76e" volumeName="kubernetes.io/projected/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-kube-api-access-qhg82" seLinuxMountContext=""
Dec 03 19:55:34.510750 master-0 kubenswrapper[9368]: I1203 19:55:34.510696 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d210062f-c07e-419f-a551-c37571565686" volumeName="kubernetes.io/configmap/d210062f-c07e-419f-a551-c37571565686-env-overrides" seLinuxMountContext=""
Dec 03 19:55:34.510750 master-0 kubenswrapper[9368]: I1203 19:55:34.510707 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="daa8efc0-4514-4a14-80f5-ab9eca53a127" volumeName="kubernetes.io/projected/daa8efc0-4514-4a14-80f5-ab9eca53a127-kube-api-access-rbsx8" seLinuxMountContext=""
Dec 03 19:55:34.510750 master-0 kubenswrapper[9368]: I1203 19:55:34.510716 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01d51d9a-9beb-4357-9dc2-aeac210cd0c4" volumeName="kubernetes.io/configmap/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-config" seLinuxMountContext=""
Dec 03 19:55:34.510750 master-0 kubenswrapper[9368]: I1203 19:55:34.510726 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" volumeName="kubernetes.io/secret/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-serving-cert" seLinuxMountContext=""
Dec 03 19:55:34.510750 master-0 kubenswrapper[9368]: I1203 19:55:34.510736 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="11e2c94f-f9e9-415b-a550-3006a4632ba4" volumeName="kubernetes.io/secret/11e2c94f-f9e9-415b-a550-3006a4632ba4-serving-cert" seLinuxMountContext=""
Dec 03 19:55:34.511021 master-0 kubenswrapper[9368]: I1203 19:55:34.510745 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6eb4700c-6af0-468b-afc8-1e09b902d6bf" volumeName="kubernetes.io/projected/6eb4700c-6af0-468b-afc8-1e09b902d6bf-kube-api-access-w7nkb" seLinuxMountContext=""
Dec 03 19:55:34.511021 master-0 kubenswrapper[9368]: I1203 19:55:34.510772 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87f1759a-7df4-442e-a22d-6de8d54be333" volumeName="kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-cni-sysctl-allowlist" seLinuxMountContext=""
Dec 03 19:55:34.511021 master-0 kubenswrapper[9368]: I1203 19:55:34.510801 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a19b8f9e-6299-43bf-9aa5-22071b855773" volumeName="kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-profile-collector-cert" seLinuxMountContext=""
Dec 03 19:55:34.511021 master-0 kubenswrapper[9368]: I1203 19:55:34.510814 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78a864f2-934f-4197-9753-24c9bc7f1fca" volumeName="kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-service-ca" seLinuxMountContext=""
Dec 03 19:55:34.511021 master-0 kubenswrapper[9368]: I1203 19:55:34.510852 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="943feb0d-7d31-446a-9100-dfc4ef013d12" volumeName="kubernetes.io/secret/943feb0d-7d31-446a-9100-dfc4ef013d12-serving-cert" seLinuxMountContext=""
Dec 03 19:55:34.511021 master-0 kubenswrapper[9368]: I1203 19:55:34.510865 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" volumeName="kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-service-ca-bundle" seLinuxMountContext=""
Dec 03 19:55:34.511021 master-0 kubenswrapper[9368]: I1203 19:55:34.510873 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="a19b8f9e-6299-43bf-9aa5-22071b855773" volumeName="kubernetes.io/projected/a19b8f9e-6299-43bf-9aa5-22071b855773-kube-api-access-6ghnf" seLinuxMountContext=""
Dec 03 19:55:34.511021 master-0 kubenswrapper[9368]: I1203 19:55:34.510883 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d210062f-c07e-419f-a551-c37571565686" volumeName="kubernetes.io/configmap/d210062f-c07e-419f-a551-c37571565686-ovnkube-config" seLinuxMountContext=""
Dec 03 19:55:34.511021 master-0 kubenswrapper[9368]: I1203 19:55:34.510935 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="11e2c94f-f9e9-415b-a550-3006a4632ba4" volumeName="kubernetes.io/configmap/11e2c94f-f9e9-415b-a550-3006a4632ba4-config" seLinuxMountContext=""
Dec 03 19:55:34.511021 master-0 kubenswrapper[9368]: I1203 19:55:34.510946 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78a864f2-934f-4197-9753-24c9bc7f1fca" volumeName="kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-config" seLinuxMountContext=""
Dec 03 19:55:34.511021 master-0 kubenswrapper[9368]: I1203 19:55:34.510955 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" volumeName="kubernetes.io/secret/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-serving-cert" seLinuxMountContext=""
Dec 03 19:55:34.511021 master-0 kubenswrapper[9368]: I1203 19:55:34.510963 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba68608f-6b36-455e-b80b-d19237df9312" volumeName="kubernetes.io/projected/ba68608f-6b36-455e-b80b-d19237df9312-kube-api-access-855t4" seLinuxMountContext=""
Dec 03 19:55:34.511021 master-0 kubenswrapper[9368]: I1203 19:55:34.510972 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="daa8efc0-4514-4a14-80f5-ab9eca53a127" volumeName="kubernetes.io/configmap/daa8efc0-4514-4a14-80f5-ab9eca53a127-config" seLinuxMountContext=""
Dec 03 19:55:34.511021 master-0 kubenswrapper[9368]: I1203 19:55:34.510981 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0c45d22f-1492-47d7-83b6-6dd356a8454d" volumeName="kubernetes.io/configmap/0c45d22f-1492-47d7-83b6-6dd356a8454d-service-ca" seLinuxMountContext=""
Dec 03 19:55:34.511021 master-0 kubenswrapper[9368]: I1203 19:55:34.510990 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0c45d22f-1492-47d7-83b6-6dd356a8454d" volumeName="kubernetes.io/projected/0c45d22f-1492-47d7-83b6-6dd356a8454d-kube-api-access" seLinuxMountContext=""
Dec 03 19:55:34.511021 master-0 kubenswrapper[9368]: I1203 19:55:34.511020 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f618ea7-3ad7-4dce-b450-a8202285f312" volumeName="kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-ovnkube-config" seLinuxMountContext=""
Dec 03 19:55:34.511021 master-0 kubenswrapper[9368]: I1203 19:55:34.511030 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b3ee9a2-0f17-4a04-9191-b60684ef6c29" volumeName="kubernetes.io/configmap/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-config" seLinuxMountContext=""
Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511047 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b3ee9a2-0f17-4a04-9191-b60684ef6c29" volumeName="kubernetes.io/projected/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-kube-api-access" seLinuxMountContext=""
Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511058 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="5decce88-c71e-411c-87b5-a37dd0f77e7b" volumeName="kubernetes.io/projected/5decce88-c71e-411c-87b5-a37dd0f77e7b-bound-sa-token" seLinuxMountContext=""
Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511069 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="943feb0d-7d31-446a-9100-dfc4ef013d12" volumeName="kubernetes.io/configmap/943feb0d-7d31-446a-9100-dfc4ef013d12-config" seLinuxMountContext=""
Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511099 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" volumeName="kubernetes.io/empty-dir/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-available-featuregates" seLinuxMountContext=""
Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511129 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f618ea7-3ad7-4dce-b450-a8202285f312" volumeName="kubernetes.io/projected/2f618ea7-3ad7-4dce-b450-a8202285f312-kube-api-access-4c9qq" seLinuxMountContext=""
Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511139 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78a864f2-934f-4197-9753-24c9bc7f1fca" volumeName="kubernetes.io/projected/78a864f2-934f-4197-9753-24c9bc7f1fca-kube-api-access-59d2r" seLinuxMountContext=""
Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511148 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ed25861-1328-45e7-922e-37588a0b019c" volumeName="kubernetes.io/projected/7ed25861-1328-45e7-922e-37588a0b019c-kube-api-access-cv24n" seLinuxMountContext=""
Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511179 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87f1759a-7df4-442e-a22d-6de8d54be333" volumeName="kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-whereabouts-configmap" seLinuxMountContext=""
Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511189 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" volumeName="kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-config" seLinuxMountContext=""
Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511200 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba68608f-6b36-455e-b80b-d19237df9312" volumeName="kubernetes.io/configmap/ba68608f-6b36-455e-b80b-d19237df9312-telemetry-config" seLinuxMountContext=""
Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511209 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d210062f-c07e-419f-a551-c37571565686" volumeName="kubernetes.io/secret/d210062f-c07e-419f-a551-c37571565686-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511220 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3" volumeName="kubernetes.io/configmap/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-config" seLinuxMountContext=""
Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511229 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5decce88-c71e-411c-87b5-a37dd0f77e7b" volumeName="kubernetes.io/configmap/5decce88-c71e-411c-87b5-a37dd0f77e7b-trusted-ca" seLinuxMountContext=""
Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511259 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="5decce88-c71e-411c-87b5-a37dd0f77e7b" volumeName="kubernetes.io/projected/5decce88-c71e-411c-87b5-a37dd0f77e7b-kube-api-access-mr8x9" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511268 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6" volumeName="kubernetes.io/projected/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-kube-api-access-bztz2" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511278 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01d51d9a-9beb-4357-9dc2-aeac210cd0c4" volumeName="kubernetes.io/projected/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-kube-api-access-6sqtm" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511287 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01d51d9a-9beb-4357-9dc2-aeac210cd0c4" volumeName="kubernetes.io/secret/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-serving-cert" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511296 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f618ea7-3ad7-4dce-b450-a8202285f312" volumeName="kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-env-overrides" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511307 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="371917da-b783-4acc-81af-1cfc903269f4" volumeName="kubernetes.io/configmap/371917da-b783-4acc-81af-1cfc903269f4-iptables-alerter-script" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511337 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="78a864f2-934f-4197-9753-24c9bc7f1fca" volumeName="kubernetes.io/secret/78a864f2-934f-4197-9753-24c9bc7f1fca-serving-cert" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511348 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6" volumeName="kubernetes.io/configmap/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-cni-binary-copy" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511359 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4316c8d-a1d3-4e51-83cc-d0eecb809924" volumeName="kubernetes.io/projected/b4316c8d-a1d3-4e51-83cc-d0eecb809924-kube-api-access-74dvx" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511368 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c4d45235-fb1a-4626-a41e-b1e34f7bf76e" volumeName="kubernetes.io/configmap/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-env-overrides" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511378 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d210062f-c07e-419f-a551-c37571565686" volumeName="kubernetes.io/projected/d210062f-c07e-419f-a551-c37571565686-kube-api-access-v7xk9" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511423 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="daa8efc0-4514-4a14-80f5-ab9eca53a127" volumeName="kubernetes.io/secret/daa8efc0-4514-4a14-80f5-ab9eca53a127-serving-cert" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511434 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0d4e4f88-7106-4a46-8b63-053345922fb0" volumeName="kubernetes.io/projected/0d4e4f88-7106-4a46-8b63-053345922fb0-kube-api-access-crfnp" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511450 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="830d89af-1266-43ac-b113-990a28595f91" volumeName="kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511463 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" volumeName="kubernetes.io/projected/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-kube-api-access-sxqph" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511491 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3" volumeName="kubernetes.io/projected/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-kube-api-access" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511502 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9f99422-7991-40ef-92a1-de2e603e47b9" volumeName="kubernetes.io/empty-dir/f9f99422-7991-40ef-92a1-de2e603e47b9-operand-assets" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511512 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" volumeName="kubernetes.io/projected/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-kube-api-access-qdhcd" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511520 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="128ed384-7ab6-41b6-bf45-c8fda917d52f" volumeName="kubernetes.io/projected/128ed384-7ab6-41b6-bf45-c8fda917d52f-kube-api-access-7qrgh" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511528 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f618ea7-3ad7-4dce-b450-a8202285f312" volumeName="kubernetes.io/secret/2f618ea7-3ad7-4dce-b450-a8202285f312-ovn-node-metrics-cert" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511537 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78a864f2-934f-4197-9753-24c9bc7f1fca" volumeName="kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-ca" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511547 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ed25861-1328-45e7-922e-37588a0b019c" volumeName="kubernetes.io/configmap/7ed25861-1328-45e7-922e-37588a0b019c-trusted-ca" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511578 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b673cb04-f6f0-4113-bdcd-d6685b942c9f" volumeName="kubernetes.io/projected/b673cb04-f6f0-4113-bdcd-d6685b942c9f-kube-api-access-m2qch" seLinuxMountContext="" Dec 03 19:55:34.511606 master-0 kubenswrapper[9368]: I1203 19:55:34.511587 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f" volumeName="kubernetes.io/configmap/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-config" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.511704 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f" volumeName="kubernetes.io/projected/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-kube-api-access-457ln" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.511737 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d5f33153-bff1-403f-ae17-b7e90500365d" volumeName="kubernetes.io/projected/d5f33153-bff1-403f-ae17-b7e90500365d-kube-api-access-5sdw4" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.511748 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d5f33153-bff1-403f-ae17-b7e90500365d" volumeName="kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-profile-collector-cert" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.511758 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9f99422-7991-40ef-92a1-de2e603e47b9" volumeName="kubernetes.io/projected/f9f99422-7991-40ef-92a1-de2e603e47b9-kube-api-access-pk4z4" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.511770 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6eb4700c-6af0-468b-afc8-1e09b902d6bf" volumeName="kubernetes.io/secret/6eb4700c-6af0-468b-afc8-1e09b902d6bf-metrics-tls" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.511815 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b84835e3-e8bc-4aa4-a8f3-f9be702a358a" volumeName="kubernetes.io/projected/b84835e3-e8bc-4aa4-a8f3-f9be702a358a-kube-api-access-vtwbs" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.511824 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="c4d45235-fb1a-4626-a41e-b1e34f7bf76e" volumeName="kubernetes.io/configmap/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-ovnkube-identity-cm" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.511832 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c4d45235-fb1a-4626-a41e-b1e34f7bf76e" volumeName="kubernetes.io/secret/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-webhook-cert" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.511840 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f" volumeName="kubernetes.io/secret/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-serving-cert" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.511848 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf" volumeName="kubernetes.io/projected/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-kube-api-access-6bhk4" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.511857 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6" volumeName="kubernetes.io/configmap/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-daemon-config" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.511895 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3" volumeName="kubernetes.io/secret/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-serving-cert" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.511908 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="371917da-b783-4acc-81af-1cfc903269f4" volumeName="kubernetes.io/projected/371917da-b783-4acc-81af-1cfc903269f4-kube-api-access-w4v7k" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.511919 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf" volumeName="kubernetes.io/configmap/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-trusted-ca" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.511931 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf" volumeName="kubernetes.io/projected/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-bound-sa-token" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.511967 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b3ee9a2-0f17-4a04-9191-b60684ef6c29" volumeName="kubernetes.io/secret/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-serving-cert" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.511984 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87f1759a-7df4-442e-a22d-6de8d54be333" volumeName="kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-cni-binary-copy" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.512009 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="943feb0d-7d31-446a-9100-dfc4ef013d12" volumeName="kubernetes.io/projected/943feb0d-7d31-446a-9100-dfc4ef013d12-kube-api-access" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.512044 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" volumeName="kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-trusted-ca-bundle" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.512055 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b673cb04-f6f0-4113-bdcd-d6685b942c9f" volumeName="kubernetes.io/configmap/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-trusted-ca" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.512064 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9f99422-7991-40ef-92a1-de2e603e47b9" volumeName="kubernetes.io/secret/f9f99422-7991-40ef-92a1-de2e603e47b9-cluster-olm-operator-serving-cert" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.512073 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78a864f2-934f-4197-9753-24c9bc7f1fca" volumeName="kubernetes.io/secret/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-client" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.512083 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="11e2c94f-f9e9-415b-a550-3006a4632ba4" volumeName="kubernetes.io/projected/11e2c94f-f9e9-415b-a550-3006a4632ba4-kube-api-access-pfqnq" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.512097 9368 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f618ea7-3ad7-4dce-b450-a8202285f312" volumeName="kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-ovnkube-script-lib" seLinuxMountContext="" Dec 03 19:55:34.513032 master-0 kubenswrapper[9368]: I1203 19:55:34.512108 9368 reconstruct.go:97] "Volume reconstruction finished" Dec 03 19:55:34.513032 
master-0 kubenswrapper[9368]: I1203 19:55:34.512163 9368 reconciler.go:26] "Reconciler: start to sync state" Dec 03 19:55:34.519315 master-0 kubenswrapper[9368]: I1203 19:55:34.519275 9368 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 03 19:55:34.540461 master-0 kubenswrapper[9368]: I1203 19:55:34.540407 9368 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 03 19:55:34.542872 master-0 kubenswrapper[9368]: I1203 19:55:34.542829 9368 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 03 19:55:34.542872 master-0 kubenswrapper[9368]: I1203 19:55:34.542867 9368 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 03 19:55:34.542961 master-0 kubenswrapper[9368]: I1203 19:55:34.542888 9368 kubelet.go:2335] "Starting kubelet main sync loop" Dec 03 19:55:34.542961 master-0 kubenswrapper[9368]: E1203 19:55:34.542926 9368 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 03 19:55:34.544442 master-0 kubenswrapper[9368]: I1203 19:55:34.544404 9368 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 19:55:34.550201 master-0 kubenswrapper[9368]: I1203 19:55:34.549381 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/3.log" Dec 03 19:55:34.550614 master-0 kubenswrapper[9368]: I1203 19:55:34.550414 9368 generic.go:334] "Generic (PLEG): container finished" podID="b495b0c38f2c54e7cc46282c5f92aab5" containerID="1e627b854436f132d47750eca5e55963c07ce2a82bb65e7317d2c359a44e0385" exitCode=1 Dec 03 19:55:34.550614 master-0 kubenswrapper[9368]: I1203 19:55:34.550464 9368 generic.go:334] "Generic (PLEG): container finished" podID="b495b0c38f2c54e7cc46282c5f92aab5" 
containerID="fc36d2a6c391f335aef0b36d050ebf1f8ee2adf514fce8229acd7a314425647c" exitCode=0 Dec 03 19:55:34.580771 master-0 kubenswrapper[9368]: I1203 19:55:34.580467 9368 generic.go:334] "Generic (PLEG): container finished" podID="87f1759a-7df4-442e-a22d-6de8d54be333" containerID="b2c2ebffcad93a655874c4b2c0e0dae1edf07cc0c8e231705d220b5fe6aadf15" exitCode=0 Dec 03 19:55:34.580771 master-0 kubenswrapper[9368]: I1203 19:55:34.580747 9368 generic.go:334] "Generic (PLEG): container finished" podID="87f1759a-7df4-442e-a22d-6de8d54be333" containerID="a396f10beccb65f07ed52d9f7eed56b73ee45537150d1fb69cde98622f0ce32a" exitCode=0 Dec 03 19:55:34.580771 master-0 kubenswrapper[9368]: I1203 19:55:34.580757 9368 generic.go:334] "Generic (PLEG): container finished" podID="87f1759a-7df4-442e-a22d-6de8d54be333" containerID="3e816effb094becdc3c407acbb3f9f27817216cdbfc7352da3c72fba2c274e3e" exitCode=0 Dec 03 19:55:34.580771 master-0 kubenswrapper[9368]: I1203 19:55:34.580765 9368 generic.go:334] "Generic (PLEG): container finished" podID="87f1759a-7df4-442e-a22d-6de8d54be333" containerID="5e06cf682588907f65a412d4ac6d4481e139ecf6ab4739442acce6158ba8872d" exitCode=0 Dec 03 19:55:34.580771 master-0 kubenswrapper[9368]: I1203 19:55:34.580774 9368 generic.go:334] "Generic (PLEG): container finished" podID="87f1759a-7df4-442e-a22d-6de8d54be333" containerID="afd903622e2f7d6d9391f2df58084fdf90b41e4e17808cb5e2d5c792f644b6df" exitCode=0 Dec 03 19:55:34.580771 master-0 kubenswrapper[9368]: I1203 19:55:34.580813 9368 generic.go:334] "Generic (PLEG): container finished" podID="87f1759a-7df4-442e-a22d-6de8d54be333" containerID="9f0406d26b61880d05d604bbabebaeef16d5bda27cf4f4f9e097201539e44456" exitCode=0 Dec 03 19:55:34.584984 master-0 kubenswrapper[9368]: I1203 19:55:34.584937 9368 generic.go:334] "Generic (PLEG): container finished" podID="74ccc53d-803e-4d7d-a9b0-6cd604e7907a" containerID="8bcfa4660c84f8191cb52e8becfb5db2481eb6ba813d896bb3f747ba456753f9" exitCode=0 Dec 03 19:55:34.587237 master-0 
kubenswrapper[9368]: I1203 19:55:34.587205 9368 generic.go:334] "Generic (PLEG): container finished" podID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerID="ebce70450136604f7c52ead6ab27edb4126b2802849c71ec6e71d90ddadab566" exitCode=0 Dec 03 19:55:34.588142 master-0 kubenswrapper[9368]: I1203 19:55:34.588113 9368 generic.go:334] "Generic (PLEG): container finished" podID="0b6e1832-278b-4e37-b92b-2584e2daa34c" containerID="5340fe194bb64dbc3aba205027b00290cb2a1905847a3d137e4cd0dbb4900723" exitCode=0 Dec 03 19:55:34.606979 master-0 kubenswrapper[9368]: I1203 19:55:34.606935 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p9sdj_a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6/kube-multus/0.log" Dec 03 19:55:34.607173 master-0 kubenswrapper[9368]: I1203 19:55:34.606990 9368 generic.go:334] "Generic (PLEG): container finished" podID="a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6" containerID="8a74abebd0e92eb267bf92fa216f251466a061d49782c0f5612aabcb75ab61c6" exitCode=1 Dec 03 19:55:34.621902 master-0 kubenswrapper[9368]: I1203 19:55:34.621867 9368 generic.go:334] "Generic (PLEG): container finished" podID="f9f99422-7991-40ef-92a1-de2e603e47b9" containerID="431c55fff96bdc81a72543ef7c8b4286f0ecf12b7dc9b0a56daf54373c4eef86" exitCode=0 Dec 03 19:55:34.632278 master-0 kubenswrapper[9368]: I1203 19:55:34.632218 9368 generic.go:334] "Generic (PLEG): container finished" podID="2f618ea7-3ad7-4dce-b450-a8202285f312" containerID="dddd03afbbaf28bd7aa58c27ce415ad910bb5c941f19a9c53d3832794bc71ce3" exitCode=0 Dec 03 19:55:34.633935 master-0 kubenswrapper[9368]: I1203 19:55:34.633891 9368 generic.go:334] "Generic (PLEG): container finished" podID="13238af3704fe583f617f61e755cf4c2" containerID="0dd950185e59dc19fc3c4c25df60c0ffa205c3f9c227153b287f2a2e9b2b9bb6" exitCode=0 Dec 03 19:55:34.643054 master-0 kubenswrapper[9368]: E1203 19:55:34.643012 9368 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 03 
19:55:34.673950 master-0 kubenswrapper[9368]: I1203 19:55:34.673895 9368 manager.go:324] Recovery completed Dec 03 19:55:34.724965 master-0 kubenswrapper[9368]: I1203 19:55:34.724390 9368 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 03 19:55:34.724965 master-0 kubenswrapper[9368]: I1203 19:55:34.724419 9368 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 03 19:55:34.724965 master-0 kubenswrapper[9368]: I1203 19:55:34.724440 9368 state_mem.go:36] "Initialized new in-memory state store" Dec 03 19:55:34.724965 master-0 kubenswrapper[9368]: I1203 19:55:34.724603 9368 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 03 19:55:34.724965 master-0 kubenswrapper[9368]: I1203 19:55:34.724617 9368 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 03 19:55:34.724965 master-0 kubenswrapper[9368]: I1203 19:55:34.724664 9368 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Dec 03 19:55:34.724965 master-0 kubenswrapper[9368]: I1203 19:55:34.724674 9368 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Dec 03 19:55:34.724965 master-0 kubenswrapper[9368]: I1203 19:55:34.724681 9368 policy_none.go:49] "None policy: Start" Dec 03 19:55:34.726402 master-0 kubenswrapper[9368]: I1203 19:55:34.726117 9368 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 03 19:55:34.726402 master-0 kubenswrapper[9368]: I1203 19:55:34.726151 9368 state_mem.go:35] "Initializing new in-memory state store" Dec 03 19:55:34.726402 master-0 kubenswrapper[9368]: I1203 19:55:34.726338 9368 state_mem.go:75] "Updated machine memory state" Dec 03 19:55:34.726402 master-0 kubenswrapper[9368]: I1203 19:55:34.726349 9368 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Dec 03 19:55:34.741911 master-0 kubenswrapper[9368]: I1203 19:55:34.741837 9368 manager.go:334] "Starting Device Plugin manager" Dec 03 19:55:34.742073 master-0 kubenswrapper[9368]: I1203 19:55:34.741955 9368 
manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 03 19:55:34.742073 master-0 kubenswrapper[9368]: I1203 19:55:34.741976 9368 server.go:79] "Starting device plugin registration server" Dec 03 19:55:34.742524 master-0 kubenswrapper[9368]: I1203 19:55:34.742492 9368 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 03 19:55:34.742573 master-0 kubenswrapper[9368]: I1203 19:55:34.742512 9368 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 03 19:55:34.743384 master-0 kubenswrapper[9368]: I1203 19:55:34.743344 9368 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 03 19:55:34.777816 master-0 kubenswrapper[9368]: I1203 19:55:34.764941 9368 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 03 19:55:34.777816 master-0 kubenswrapper[9368]: I1203 19:55:34.764968 9368 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 03 19:55:34.842697 master-0 kubenswrapper[9368]: I1203 19:55:34.842657 9368 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 19:55:34.843840 master-0 kubenswrapper[9368]: I1203 19:55:34.843732 9368 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0"] Dec 03 19:55:34.844150 master-0 kubenswrapper[9368]: I1203 19:55:34.844123 9368 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 19:55:34.844199 master-0 kubenswrapper[9368]: I1203 19:55:34.844154 9368 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasNoDiskPressure" Dec 03 19:55:34.844199 master-0 kubenswrapper[9368]: I1203 19:55:34.844164 9368 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 19:55:34.844265 master-0 kubenswrapper[9368]: I1203 19:55:34.844209 9368 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 19:55:34.844489 master-0 kubenswrapper[9368]: I1203 19:55:34.844328 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerStarted","Data":"613dced068ceb2df4bbce683ccba9c87ef2fc3f6a3e401852118424ac1bf3a4c"} Dec 03 19:55:34.844551 master-0 kubenswrapper[9368]: I1203 19:55:34.844495 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerDied","Data":"1e627b854436f132d47750eca5e55963c07ce2a82bb65e7317d2c359a44e0385"} Dec 03 19:55:34.847762 master-0 kubenswrapper[9368]: I1203 19:55:34.847449 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerDied","Data":"fc36d2a6c391f335aef0b36d050ebf1f8ee2adf514fce8229acd7a314425647c"} Dec 03 19:55:34.848477 master-0 kubenswrapper[9368]: I1203 19:55:34.848451 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerStarted","Data":"46b628f030def8d568abe6c88697be71ce064596569bc0a66bddd83c9802cf26"} Dec 03 19:55:34.848767 master-0 kubenswrapper[9368]: I1203 19:55:34.848514 9368 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="205cff62e7c175b9c7d41b9cc295efc2fd405b7085e08d33345741b64c79849c" Dec 03 
19:55:34.848767 master-0 kubenswrapper[9368]: I1203 19:55:34.848538 9368 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2dc381563df2a0e13918cfa2451a9b174e4604bd05cb59f08912f9e42b984c0" Dec 03 19:55:34.848767 master-0 kubenswrapper[9368]: I1203 19:55:34.848551 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"23c2b742ed78624af8a87bafdac0a226661dbc177a2ddfac515be738b044bdfc"} Dec 03 19:55:34.848767 master-0 kubenswrapper[9368]: I1203 19:55:34.848564 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"7def324ef495c1e55c8e9233ccd93d3408c35454ff9a9bc3bac5d21a48173630"} Dec 03 19:55:34.848767 master-0 kubenswrapper[9368]: I1203 19:55:34.848576 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"fb66039883a03fd1626aa3dffc21a20bb7b9e0bf48c135b576d0ba2ac23105d3"} Dec 03 19:55:34.848767 master-0 kubenswrapper[9368]: I1203 19:55:34.848600 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"41b95a38663dd6fe34e183818a475977","Type":"ContainerStarted","Data":"05747084f9e49c9f0d255ef42ef3e83cd2a8abb1990c562931e3ac0ccc06b877"} Dec 03 19:55:34.848767 master-0 kubenswrapper[9368]: I1203 19:55:34.848612 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"41b95a38663dd6fe34e183818a475977","Type":"ContainerStarted","Data":"fc327643e61db9d9337a443f21096010694e550ffc71b3be3921aca847fdd4bd"} Dec 03 19:55:34.848767 master-0 kubenswrapper[9368]: I1203 19:55:34.848626 9368 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"41b95a38663dd6fe34e183818a475977","Type":"ContainerStarted","Data":"03f773582fd952a02e3c74054c230118b5ae30a27243d494447b73fc93b2a301"}
Dec 03 19:55:34.848767 master-0 kubenswrapper[9368]: I1203 19:55:34.848645 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"d78739a7694769882b7e47ea5ac08a10","Type":"ContainerStarted","Data":"ca335c8e4de4141862b380dce4757695adee236b409b9c589070127007153500"}
Dec 03 19:55:34.848767 master-0 kubenswrapper[9368]: I1203 19:55:34.848656 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"d78739a7694769882b7e47ea5ac08a10","Type":"ContainerStarted","Data":"80067c895e8606a9acda897c0ee9b8e4c440d9838ee8d74d86c0a12d51b59462"}
Dec 03 19:55:34.848767 master-0 kubenswrapper[9368]: I1203 19:55:34.848684 9368 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46c36de02c52c74fa950885a9f6ca90bd94c0ce1773d696e2cba0138494bdb20"
Dec 03 19:55:34.848767 master-0 kubenswrapper[9368]: I1203 19:55:34.848722 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"13238af3704fe583f617f61e755cf4c2","Type":"ContainerStarted","Data":"c991f7eb8b7ab6bf0ec4d63b9e190d2c916d375ff6a09d91a53387bce5766b67"}
Dec 03 19:55:34.848767 master-0 kubenswrapper[9368]: I1203 19:55:34.848739 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"13238af3704fe583f617f61e755cf4c2","Type":"ContainerStarted","Data":"f5e393ad9e4b2248a04a2f1824c04ab00a09cf0b58d03e8aab531d5a360dcee3"}
Dec 03 19:55:34.848767 master-0 kubenswrapper[9368]: I1203 19:55:34.848755 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"13238af3704fe583f617f61e755cf4c2","Type":"ContainerDied","Data":"0dd950185e59dc19fc3c4c25df60c0ffa205c3f9c227153b287f2a2e9b2b9bb6"}
Dec 03 19:55:34.848767 master-0 kubenswrapper[9368]: I1203 19:55:34.848771 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"13238af3704fe583f617f61e755cf4c2","Type":"ContainerStarted","Data":"e223c914bb9fdab3679b22e12a3423e70834ea2d5e7b1b525318a3b2a1eb7382"}
Dec 03 19:55:34.854445 master-0 kubenswrapper[9368]: I1203 19:55:34.854420 9368 kubelet_node_status.go:115] "Node was previously registered" node="master-0"
Dec 03 19:55:34.854630 master-0 kubenswrapper[9368]: I1203 19:55:34.854489 9368 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Dec 03 19:55:34.856040 master-0 kubenswrapper[9368]: E1203 19:55:34.856015 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0"
Dec 03 19:55:34.856332 master-0 kubenswrapper[9368]: E1203 19:55:34.856280 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:34.856332 master-0 kubenswrapper[9368]: E1203 19:55:34.856322 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:55:34.856398 master-0 kubenswrapper[9368]: E1203 19:55:34.856379 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0"
Dec 03 19:55:34.856428 master-0 kubenswrapper[9368]: E1203 19:55:34.856407 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Dec 03 19:55:34.922845 master-0 kubenswrapper[9368]: I1203 19:55:34.920037 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:55:34.922845 master-0 kubenswrapper[9368]: I1203 19:55:34.920090 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0"
Dec 03 19:55:34.922845 master-0 kubenswrapper[9368]: I1203 19:55:34.920120 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:34.922845 master-0 kubenswrapper[9368]: I1203 19:55:34.920140 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:55:34.922845 master-0 kubenswrapper[9368]: I1203 19:55:34.920163 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:55:34.922845 master-0 kubenswrapper[9368]: I1203 19:55:34.920185 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Dec 03 19:55:34.922845 master-0 kubenswrapper[9368]: I1203 19:55:34.920204 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Dec 03 19:55:34.922845 master-0 kubenswrapper[9368]: I1203 19:55:34.920224 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Dec 03 19:55:34.922845 master-0 kubenswrapper[9368]: I1203 19:55:34.920245 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-certs\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0"
Dec 03 19:55:34.922845 master-0 kubenswrapper[9368]: I1203 19:55:34.920264 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:34.922845 master-0 kubenswrapper[9368]: I1203 19:55:34.920283 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:34.922845 master-0 kubenswrapper[9368]: I1203 19:55:34.920301 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:55:34.922845 master-0 kubenswrapper[9368]: I1203 19:55:34.920322 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:55:34.922845 master-0 kubenswrapper[9368]: I1203 19:55:34.920344 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Dec 03 19:55:34.922845 master-0 kubenswrapper[9368]: I1203 19:55:34.920368 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:34.922845 master-0 kubenswrapper[9368]: I1203 19:55:34.920386 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:34.922845 master-0 kubenswrapper[9368]: I1203 19:55:34.920405 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:34.933265 master-0 kubenswrapper[9368]: W1203 19:55:34.931589 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod367c2c7c_1fc8_4608_aa94_b64c6c70cc61.slice/crio-988c74f72d6d3987e23eadc15e10a46097f9412b88f2d407e398a913b05fa016 WatchSource:0}: Error finding container 988c74f72d6d3987e23eadc15e10a46097f9412b88f2d407e398a913b05fa016: Status 404 returned error can't find the container with id 988c74f72d6d3987e23eadc15e10a46097f9412b88f2d407e398a913b05fa016
Dec 03 19:55:35.020935 master-0 kubenswrapper[9368]: I1203 19:55:35.020767 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:55:35.020935 master-0 kubenswrapper[9368]: I1203 19:55:35.020832 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Dec 03 19:55:35.020935 master-0 kubenswrapper[9368]: I1203 19:55:35.020852 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Dec 03 19:55:35.020935 master-0 kubenswrapper[9368]: I1203 19:55:35.020870 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Dec 03 19:55:35.020935 master-0 kubenswrapper[9368]: I1203 19:55:35.020889 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-certs\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0"
Dec 03 19:55:35.020935 master-0 kubenswrapper[9368]: I1203 19:55:35.020892 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Dec 03 19:55:35.020935 master-0 kubenswrapper[9368]: I1203 19:55:35.020944 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-certs\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.020879 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.020952 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.020980 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.020911 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.020963 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021059 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021080 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021098 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021112 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021124 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021145 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021170 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021176 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021127 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021226 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021181 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021274 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021290 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021294 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021307 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021323 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021383 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021345 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021361 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021365 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021334 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:55:35.021545 master-0 kubenswrapper[9368]: I1203 19:55:35.021448 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:55:35.482668 master-0 kubenswrapper[9368]: I1203 19:55:35.482572 9368 apiserver.go:52] "Watching apiserver"
Dec 03 19:55:35.503313 master-0 kubenswrapper[9368]: I1203 19:55:35.502947 9368 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 03 19:55:35.504651 master-0 kubenswrapper[9368]: I1203 19:55:35.504562 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw","openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh","openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p","openshift-network-node-identity/network-node-identity-r2kpn","openshift-network-operator/iptables-alerter-72rrb","openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6","openshift-network-diagnostics/network-check-target-x6vwd","openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n","openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg","assisted-installer/assisted-installer-controller-ljsns","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn","openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5","openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5","openshift-multus/multus-p9sdj","openshift-multus/network-metrics-daemon-hs6gf","kube-system/bootstrap-kube-controller-manager-master-0","openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt","openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz","openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl","openshift-network-operator/network-operator-6cbf58c977-w7d8t","openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv","openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7","openshift-controller-manager/controller-manager-56fb5cd58b-cqggq","openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2","openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj","openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw","openshift-multus/multus-additional-cni-plugins-pwlw2","openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q","openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb","openshift-cluster-version/cluster-version-operator-869c786959-zbl42","openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-ovn-kubernetes/ovnkube-node-l9m2r","openshift-service-ca/service-ca-6b8bb995f7-bj4vz","kube-system/bootstrap-kube-scheduler-master-0","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx","openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-h2w9j","openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j"]
Dec 03 19:55:35.504969 master-0 kubenswrapper[9368]: I1203 19:55:35.504922 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-ljsns"
Dec 03 19:55:35.505087 master-0 kubenswrapper[9368]: I1203 19:55:35.505039 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"
Dec 03 19:55:35.505200 master-0 kubenswrapper[9368]: I1203 19:55:35.505122 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 19:55:35.506878 master-0 kubenswrapper[9368]: I1203 19:55:35.506829 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q"
Dec 03 19:55:35.507698 master-0 kubenswrapper[9368]: I1203 19:55:35.507331 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Dec 03 19:55:35.509323 master-0 kubenswrapper[9368]: I1203 19:55:35.509265 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Dec 03 19:55:35.509584 master-0 kubenswrapper[9368]: I1203 19:55:35.509495 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p"
Dec 03 19:55:35.510030 master-0 kubenswrapper[9368]: I1203 19:55:35.509955 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Dec 03 19:55:35.510337 master-0 kubenswrapper[9368]: I1203 19:55:35.510288 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6"
Dec 03 19:55:35.510509 master-0 kubenswrapper[9368]: I1203 19:55:35.510436 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2"
Dec 03 19:55:35.510609 master-0 kubenswrapper[9368]: I1203 19:55:35.510442 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 03 19:55:35.510772 master-0 kubenswrapper[9368]: I1203 19:55:35.510735 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:35.510772 master-0 kubenswrapper[9368]: I1203 19:55:35.510761 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n"
Dec 03 19:55:35.511031 master-0 kubenswrapper[9368]: I1203 19:55:35.510870 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 03 19:55:35.511031 master-0 kubenswrapper[9368]: I1203 19:55:35.510979 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 03 19:55:35.511210 master-0 kubenswrapper[9368]: I1203 19:55:35.511090 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Dec 03 19:55:35.511631 master-0 kubenswrapper[9368]: I1203 19:55:35.511567 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42"
Dec 03 19:55:35.513594 master-0 kubenswrapper[9368]: I1203 19:55:35.512853 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Dec 03 19:55:35.513594 master-0 kubenswrapper[9368]: I1203 19:55:35.512887 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj"
Dec 03 19:55:35.513594 master-0 kubenswrapper[9368]: I1203 19:55:35.512900 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Dec 03 19:55:35.513594 master-0 kubenswrapper[9368]: I1203 19:55:35.512935 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 03 19:55:35.513594 master-0 kubenswrapper[9368]: I1203 19:55:35.513099 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 03 19:55:35.513594 master-0 kubenswrapper[9368]: I1203 19:55:35.512961 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 03 19:55:35.513594 master-0 kubenswrapper[9368]: I1203 19:55:35.513359 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 03 19:55:35.514368 master-0 kubenswrapper[9368]: I1203 19:55:35.513758 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:55:35.520118 master-0 kubenswrapper[9368]: I1203 19:55:35.520030 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.527094 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.527312 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.527569 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.527590 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.527752 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.527868 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.528003 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.528072 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.528213 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.528259 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.528611 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.528686 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.528845 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.529075 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.529128 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.529334 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.529356 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.529459 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.529477 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56fb5cd58b-cqggq"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.529571 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-h2w9j"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.529599 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.529665 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.529768 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.529941 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.530131 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.530312 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 19:55:35.530686 master-0 kubenswrapper[9368]: I1203 19:55:35.527996 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" Dec 03 19:55:35.532956 master-0 kubenswrapper[9368]: I1203 19:55:35.532352 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sqtm\" (UniqueName: \"kubernetes.io/projected/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-kube-api-access-6sqtm\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" Dec 03 19:55:35.532956 master-0 kubenswrapper[9368]: I1203 19:55:35.532412 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" Dec 03 19:55:35.532956 master-0 kubenswrapper[9368]: I1203 19:55:35.532439 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f9f99422-7991-40ef-92a1-de2e603e47b9-operand-assets\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" Dec 03 19:55:35.532956 master-0 kubenswrapper[9368]: I1203 19:55:35.532889 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk4z4\" (UniqueName: \"kubernetes.io/projected/f9f99422-7991-40ef-92a1-de2e603e47b9-kube-api-access-pk4z4\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" Dec 03 19:55:35.533452 master-0 kubenswrapper[9368]: I1203 19:55:35.533339 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-config\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 19:55:35.533452 master-0 kubenswrapper[9368]: I1203 19:55:35.533410 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f9f99422-7991-40ef-92a1-de2e603e47b9-operand-assets\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" Dec 03 19:55:35.533678 master-0 kubenswrapper[9368]: I1203 19:55:35.533493 9368 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-config\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" Dec 03 19:55:35.533678 master-0 kubenswrapper[9368]: I1203 19:55:35.533520 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Dec 03 19:55:35.533928 master-0 kubenswrapper[9368]: I1203 19:55:35.533892 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 19:55:35.534214 master-0 kubenswrapper[9368]: I1203 19:55:35.534161 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 19:55:35.534333 master-0 kubenswrapper[9368]: I1203 19:55:35.534245 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 19:55:35.535764 master-0 kubenswrapper[9368]: I1203 19:55:35.535640 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-available-featuregates\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 19:55:35.536177 master-0 kubenswrapper[9368]: I1203 19:55:35.536095 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-config\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" 
Dec 03 19:55:35.537858 master-0 kubenswrapper[9368]: I1203 19:55:35.537718 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-available-featuregates\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 19:55:35.538197 master-0 kubenswrapper[9368]: I1203 19:55:35.535894 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5decce88-c71e-411c-87b5-a37dd0f77e7b-bound-sa-token\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 19:55:35.538474 master-0 kubenswrapper[9368]: I1203 19:55:35.538418 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 19:55:35.538933 master-0 kubenswrapper[9368]: I1203 19:55:35.538860 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 19:55:35.539169 master-0 kubenswrapper[9368]: I1203 19:55:35.539046 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 19:55:35.539513 master-0 kubenswrapper[9368]: I1203 19:55:35.539475 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Dec 03 19:55:35.540578 master-0 kubenswrapper[9368]: I1203 19:55:35.539527 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 19:55:35.540578 master-0 
kubenswrapper[9368]: I1203 19:55:35.539908 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 19:55:35.540578 master-0 kubenswrapper[9368]: I1203 19:55:35.540079 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 19:55:35.540578 master-0 kubenswrapper[9368]: I1203 19:55:35.539654 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-profile-collector-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" Dec 03 19:55:35.540578 master-0 kubenswrapper[9368]: I1203 19:55:35.540397 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 19:55:35.540578 master-0 kubenswrapper[9368]: I1203 19:55:35.540474 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/943feb0d-7d31-446a-9100-dfc4ef013d12-config\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" Dec 03 19:55:35.541596 master-0 kubenswrapper[9368]: I1203 19:55:35.540744 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 19:55:35.541596 master-0 kubenswrapper[9368]: I1203 
19:55:35.540850 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-serving-cert\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" Dec 03 19:55:35.541596 master-0 kubenswrapper[9368]: I1203 19:55:35.541205 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 19:55:35.541596 master-0 kubenswrapper[9368]: I1203 19:55:35.541448 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" Dec 03 19:55:35.541946 master-0 kubenswrapper[9368]: I1203 19:55:35.541523 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfqnq\" (UniqueName: \"kubernetes.io/projected/11e2c94f-f9e9-415b-a550-3006a4632ba4-kube-api-access-pfqnq\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" Dec 03 19:55:35.541946 master-0 kubenswrapper[9368]: I1203 19:55:35.541843 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-serving-cert\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" Dec 03 19:55:35.542145 master-0 kubenswrapper[9368]: I1203 19:55:35.542091 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a864f2-934f-4197-9753-24c9bc7f1fca-serving-cert\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 19:55:35.542358 master-0 kubenswrapper[9368]: I1203 19:55:35.542108 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-profile-collector-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" Dec 03 19:55:35.542358 master-0 kubenswrapper[9368]: I1203 19:55:35.542216 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" Dec 03 19:55:35.542728 master-0 kubenswrapper[9368]: I1203 19:55:35.542669 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 19:55:35.542937 master-0 kubenswrapper[9368]: I1203 19:55:35.542480 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-trusted-ca-bundle\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " 
pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 19:55:35.543181 master-0 kubenswrapper[9368]: I1203 19:55:35.543108 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxqph\" (UniqueName: \"kubernetes.io/projected/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-kube-api-access-sxqph\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 19:55:35.543181 master-0 kubenswrapper[9368]: I1203 19:55:35.542785 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 19:55:35.543536 master-0 kubenswrapper[9368]: I1203 19:55:35.538918 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 19:55:35.543536 master-0 kubenswrapper[9368]: I1203 19:55:35.542957 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Dec 03 19:55:35.543536 master-0 kubenswrapper[9368]: I1203 19:55:35.543234 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Dec 03 19:55:35.543536 master-0 kubenswrapper[9368]: I1203 19:55:35.543255 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 19:55:35.543536 master-0 kubenswrapper[9368]: I1203 19:55:35.543243 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/943feb0d-7d31-446a-9100-dfc4ef013d12-config\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" Dec 03 
19:55:35.544656 master-0 kubenswrapper[9368]: I1203 19:55:35.543177 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-service-ca-bundle\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 19:55:35.544656 master-0 kubenswrapper[9368]: I1203 19:55:35.544028 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-service-ca-bundle\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 19:55:35.544656 master-0 kubenswrapper[9368]: I1203 19:55:35.543940 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-config\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" Dec 03 19:55:35.544656 master-0 kubenswrapper[9368]: I1203 19:55:35.544168 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a864f2-934f-4197-9753-24c9bc7f1fca-serving-cert\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 19:55:35.544656 master-0 kubenswrapper[9368]: I1203 19:55:35.544183 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 19:55:35.544656 master-0 
kubenswrapper[9368]: I1203 19:55:35.544359 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 19:55:35.544656 master-0 kubenswrapper[9368]: I1203 19:55:35.544197 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtwbs\" (UniqueName: \"kubernetes.io/projected/b84835e3-e8bc-4aa4-a8f3-f9be702a358a-kube-api-access-vtwbs\") pod \"csi-snapshot-controller-operator-7b795784b8-4gppw\" (UID: \"b84835e3-e8bc-4aa4-a8f3-f9be702a358a\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw" Dec 03 19:55:35.544656 master-0 kubenswrapper[9368]: I1203 19:55:35.544211 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 19:55:35.544656 master-0 kubenswrapper[9368]: I1203 19:55:35.544552 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 19:55:35.544656 master-0 kubenswrapper[9368]: I1203 19:55:35.544594 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 19:55:35.546525 master-0 kubenswrapper[9368]: I1203 19:55:35.545126 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ghnf\" (UniqueName: \"kubernetes.io/projected/a19b8f9e-6299-43bf-9aa5-22071b855773-kube-api-access-6ghnf\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" Dec 03 19:55:35.546525 master-0 kubenswrapper[9368]: I1203 19:55:35.545191 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 19:55:35.546525 master-0 kubenswrapper[9368]: I1203 19:55:35.545306 9368 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-serving-cert\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" Dec 03 19:55:35.546525 master-0 kubenswrapper[9368]: I1203 19:55:35.545862 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 19:55:35.547209 master-0 kubenswrapper[9368]: I1203 19:55:35.547095 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-config\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" Dec 03 19:55:35.549226 master-0 kubenswrapper[9368]: I1203 19:55:35.549158 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdhcd\" (UniqueName: \"kubernetes.io/projected/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-kube-api-access-qdhcd\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 19:55:35.549226 master-0 kubenswrapper[9368]: I1203 19:55:35.549208 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 19:55:35.549460 master-0 kubenswrapper[9368]: I1203 19:55:35.549229 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e2c94f-f9e9-415b-a550-3006a4632ba4-config\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: 
\"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" Dec 03 19:55:35.549460 master-0 kubenswrapper[9368]: I1203 19:55:35.549285 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-457ln\" (UniqueName: \"kubernetes.io/projected/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-kube-api-access-457ln\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" Dec 03 19:55:35.549460 master-0 kubenswrapper[9368]: I1203 19:55:35.549336 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-serving-cert\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" Dec 03 19:55:35.549460 master-0 kubenswrapper[9368]: I1203 19:55:35.549383 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-client\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 19:55:35.549460 master-0 kubenswrapper[9368]: I1203 19:55:35.549441 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Dec 03 19:55:35.549926 master-0 kubenswrapper[9368]: I1203 19:55:35.549740 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56fb5cd58b-cqggq" Dec 03 19:55:35.549926 master-0 kubenswrapper[9368]: I1203 19:55:35.549854 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-serving-cert\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" Dec 03 19:55:35.550120 master-0 kubenswrapper[9368]: I1203 19:55:35.549438 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" Dec 03 19:55:35.550227 master-0 kubenswrapper[9368]: I1203 19:55:35.550177 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-855t4\" (UniqueName: \"kubernetes.io/projected/ba68608f-6b36-455e-b80b-d19237df9312-kube-api-access-855t4\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" Dec 03 19:55:35.550324 master-0 kubenswrapper[9368]: I1203 19:55:35.550239 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5decce88-c71e-411c-87b5-a37dd0f77e7b-trusted-ca\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 19:55:35.550324 master-0 kubenswrapper[9368]: I1203 19:55:35.550297 9368 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59d2r\" (UniqueName: \"kubernetes.io/projected/78a864f2-934f-4197-9753-24c9bc7f1fca-kube-api-access-59d2r\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 19:55:35.550499 master-0 kubenswrapper[9368]: I1203 19:55:35.550367 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr8x9\" (UniqueName: \"kubernetes.io/projected/5decce88-c71e-411c-87b5-a37dd0f77e7b-kube-api-access-mr8x9\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 19:55:35.550499 master-0 kubenswrapper[9368]: I1203 19:55:35.550419 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7nkb\" (UniqueName: \"kubernetes.io/projected/6eb4700c-6af0-468b-afc8-1e09b902d6bf-kube-api-access-w7nkb\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" Dec 03 19:55:35.550499 master-0 kubenswrapper[9368]: I1203 19:55:35.550465 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-serving-cert\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 19:55:35.551026 master-0 kubenswrapper[9368]: I1203 19:55:35.550513 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/943feb0d-7d31-446a-9100-dfc4ef013d12-serving-cert\") pod 
\"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" Dec 03 19:55:35.551026 master-0 kubenswrapper[9368]: I1203 19:55:35.550560 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/943feb0d-7d31-446a-9100-dfc4ef013d12-kube-api-access\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" Dec 03 19:55:35.551026 master-0 kubenswrapper[9368]: I1203 19:55:35.550604 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-client\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 19:55:35.551026 master-0 kubenswrapper[9368]: I1203 19:55:35.550617 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:35.551026 master-0 kubenswrapper[9368]: I1203 19:55:35.550736 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-service-ca\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 19:55:35.551026 master-0 kubenswrapper[9368]: I1203 19:55:35.550830 
9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-trusted-ca\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:35.551026 master-0 kubenswrapper[9368]: I1203 19:55:35.550889 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11e2c94f-f9e9-415b-a550-3006a4632ba4-serving-cert\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" Dec 03 19:55:35.551026 master-0 kubenswrapper[9368]: I1203 19:55:35.550942 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e2c94f-f9e9-415b-a550-3006a4632ba4-config\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" Dec 03 19:55:35.551026 master-0 kubenswrapper[9368]: I1203 19:55:35.550991 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" Dec 03 19:55:35.551026 master-0 kubenswrapper[9368]: I1203 19:55:35.551052 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qrgh\" (UniqueName: 
\"kubernetes.io/projected/128ed384-7ab6-41b6-bf45-c8fda917d52f-kube-api-access-7qrgh\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551106 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ba68608f-6b36-455e-b80b-d19237df9312-telemetry-config\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551194 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2qch\" (UniqueName: \"kubernetes.io/projected/b673cb04-f6f0-4113-bdcd-d6685b942c9f-kube-api-access-m2qch\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551242 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-serving-cert\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551249 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-config\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551363 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6eb4700c-6af0-468b-afc8-1e09b902d6bf-metrics-tls\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551418 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-kube-api-access\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551446 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/943feb0d-7d31-446a-9100-dfc4ef013d12-serving-cert\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551467 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-config\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551484 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-serving-cert\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551515 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-serving-cert\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551621 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crfnp\" (UniqueName: \"kubernetes.io/projected/0d4e4f88-7106-4a46-8b63-053345922fb0-kube-api-access-crfnp\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551671 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-ca\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551726 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: 
\"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551752 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-service-ca\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551788 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/6eb4700c-6af0-468b-afc8-1e09b902d6bf-host-etc-kube\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551870 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ba68608f-6b36-455e-b80b-d19237df9312-telemetry-config\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551887 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9f99422-7991-40ef-92a1-de2e603e47b9-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551940 9368 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-serving-cert\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 19:55:35.552049 master-0 kubenswrapper[9368]: I1203 19:55:35.551991 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-config\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" Dec 03 19:55:35.553697 master-0 kubenswrapper[9368]: I1203 19:55:35.552159 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11e2c94f-f9e9-415b-a550-3006a4632ba4-serving-cert\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" Dec 03 19:55:35.553697 master-0 kubenswrapper[9368]: I1203 19:55:35.552387 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-config\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" Dec 03 19:55:35.553697 master-0 kubenswrapper[9368]: I1203 19:55:35.552517 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6eb4700c-6af0-468b-afc8-1e09b902d6bf-metrics-tls\") pod \"network-operator-6cbf58c977-w7d8t\" 
(UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" Dec 03 19:55:35.553697 master-0 kubenswrapper[9368]: I1203 19:55:35.552517 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9f99422-7991-40ef-92a1-de2e603e47b9-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" Dec 03 19:55:35.553697 master-0 kubenswrapper[9368]: I1203 19:55:35.551770 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-config\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" Dec 03 19:55:35.553697 master-0 kubenswrapper[9368]: I1203 19:55:35.552666 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-serving-cert\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 19:55:35.553697 master-0 kubenswrapper[9368]: I1203 19:55:35.552941 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-serving-cert\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" Dec 03 19:55:35.553697 master-0 kubenswrapper[9368]: I1203 
19:55:35.552955 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-config\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 19:55:35.553697 master-0 kubenswrapper[9368]: I1203 19:55:35.552946 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-ca\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 19:55:35.553697 master-0 kubenswrapper[9368]: I1203 19:55:35.553106 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 19:55:35.553697 master-0 kubenswrapper[9368]: I1203 19:55:35.553499 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 19:55:35.553697 master-0 kubenswrapper[9368]: I1203 19:55:35.553622 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Dec 03 19:55:35.554651 master-0 kubenswrapper[9368]: I1203 19:55:35.554534 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 19:55:35.555268 master-0 kubenswrapper[9368]: I1203 19:55:35.555213 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 19:55:35.557697 master-0 kubenswrapper[9368]: I1203 19:55:35.557623 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 
19:55:35.557937 master-0 kubenswrapper[9368]: I1203 19:55:35.557707 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 19:55:35.558176 master-0 kubenswrapper[9368]: I1203 19:55:35.558123 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 19:55:35.558176 master-0 kubenswrapper[9368]: I1203 19:55:35.558140 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 19:55:35.558361 master-0 kubenswrapper[9368]: I1203 19:55:35.558235 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 19:55:35.559090 master-0 kubenswrapper[9368]: I1203 19:55:35.559056 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 19:55:35.559382 master-0 kubenswrapper[9368]: I1203 19:55:35.559206 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 19:55:35.561488 master-0 kubenswrapper[9368]: I1203 19:55:35.561430 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 19:55:35.562540 master-0 kubenswrapper[9368]: I1203 19:55:35.562486 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Dec 03 19:55:35.565466 master-0 kubenswrapper[9368]: I1203 19:55:35.565424 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 19:55:35.565858 master-0 kubenswrapper[9368]: I1203 19:55:35.565789 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" 
Dec 03 19:55:35.566473 master-0 kubenswrapper[9368]: I1203 19:55:35.566410 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 19:55:35.567513 master-0 kubenswrapper[9368]: I1203 19:55:35.567428 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 19:55:35.567742 master-0 kubenswrapper[9368]: I1203 19:55:35.567610 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 19:55:35.568042 master-0 kubenswrapper[9368]: I1203 19:55:35.567947 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 19:55:35.572025 master-0 kubenswrapper[9368]: I1203 19:55:35.571964 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 19:55:35.572660 master-0 kubenswrapper[9368]: I1203 19:55:35.572593 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 19:55:35.573820 master-0 kubenswrapper[9368]: I1203 19:55:35.573715 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-trusted-ca-bundle\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 19:55:35.575553 master-0 kubenswrapper[9368]: I1203 19:55:35.575505 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 19:55:35.576117 master-0 kubenswrapper[9368]: I1203 19:55:35.576072 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-config\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" Dec 03 19:55:35.577943 master-0 kubenswrapper[9368]: I1203 19:55:35.577890 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 19:55:35.580958 master-0 kubenswrapper[9368]: I1203 19:55:35.580897 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 19:55:35.581168 master-0 kubenswrapper[9368]: I1203 19:55:35.580926 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Dec 03 19:55:35.581369 master-0 kubenswrapper[9368]: I1203 19:55:35.581247 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 19:55:35.581534 master-0 kubenswrapper[9368]: I1203 19:55:35.581511 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 19:55:35.582125 master-0 kubenswrapper[9368]: I1203 19:55:35.582084 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 19:55:35.582425 master-0 kubenswrapper[9368]: I1203 19:55:35.582310 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 19:55:35.582829 master-0 kubenswrapper[9368]: I1203 19:55:35.582727 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5decce88-c71e-411c-87b5-a37dd0f77e7b-trusted-ca\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " 
pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 19:55:35.583065 master-0 kubenswrapper[9368]: I1203 19:55:35.582835 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 19:55:35.584706 master-0 kubenswrapper[9368]: I1203 19:55:35.584638 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 19:55:35.586459 master-0 kubenswrapper[9368]: I1203 19:55:35.586379 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 19:55:35.593387 master-0 kubenswrapper[9368]: I1203 19:55:35.593295 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 19:55:35.594952 master-0 kubenswrapper[9368]: I1203 19:55:35.594900 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 19:55:35.595943 master-0 kubenswrapper[9368]: I1203 19:55:35.595883 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 19:55:35.596761 master-0 kubenswrapper[9368]: I1203 19:55:35.596713 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 19:55:35.598438 master-0 kubenswrapper[9368]: I1203 19:55:35.598349 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-trusted-ca\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:35.598937 master-0 kubenswrapper[9368]: I1203 19:55:35.598770 9368 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 19:55:35.607136 master-0 kubenswrapper[9368]: I1203 19:55:35.607083 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 19:55:35.610011 master-0 kubenswrapper[9368]: I1203 19:55:35.609971 9368 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Dec 03 19:55:35.627522 master-0 kubenswrapper[9368]: I1203 19:55:35.627492 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 19:55:35.638309 master-0 kubenswrapper[9368]: I1203 19:55:35.638256 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56fb5cd58b-cqggq" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.649923 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.653246 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.653284 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-cni-bin\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:35.655294 master-0 
kubenswrapper[9368]: I1203 19:55:35.653309 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0c45d22f-1492-47d7-83b6-6dd356a8454d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.653333 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c45d22f-1492-47d7-83b6-6dd356a8454d-service-ca\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.653355 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-bound-sa-token\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.653376 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-slash\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.653401 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-netns\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.653421 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0c45d22f-1492-47d7-83b6-6dd356a8454d-etc-ssl-certs\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.653441 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-profile-collector-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.653461 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/63e3d36d-1676-4f90-ac9a-d85b861a4655-signing-cabundle\") pod \"service-ca-6b8bb995f7-bj4vz\" (UID: \"63e3d36d-1676-4f90-ac9a-d85b861a4655\") " pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.653479 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-system-cni-dir\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 19:55:35.655294 
master-0 kubenswrapper[9368]: I1203 19:55:35.653521 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-env-overrides\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.653539 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-config\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.653564 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4v7k\" (UniqueName: \"kubernetes.io/projected/371917da-b783-4acc-81af-1cfc903269f4-kube-api-access-w4v7k\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.653603 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-socket-dir-parent\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.653638 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj5hc\" (UniqueName: \"kubernetes.io/projected/82718569-4870-4f94-b2e7-7ccd7d4de8ff-kube-api-access-zj5hc\") pod 
\"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.653672 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d210062f-c07e-419f-a551-c37571565686-env-overrides\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.653703 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbsx8\" (UniqueName: \"kubernetes.io/projected/daa8efc0-4514-4a14-80f5-ab9eca53a127-kube-api-access-rbsx8\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.653728 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-whereabouts-configmap\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.653753 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " 
pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj"
Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: E1203 19:55:35.653944 9368 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: E1203 19:55:35.654113 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics podName:b673cb04-f6f0-4113-bdcd-d6685b942c9f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:36.154081309 +0000 UTC m=+1.815331220 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-xqvv6" (UID: "b673cb04-f6f0-4113-bdcd-d6685b942c9f") : secret "marketplace-operator-metrics" not found
Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.654612 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-whereabouts-configmap\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.654827 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c45d22f-1492-47d7-83b6-6dd356a8454d-service-ca\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42"
Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.655104 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-profile-collector-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n"
Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.655143 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bztz2\" (UniqueName: \"kubernetes.io/projected/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-kube-api-access-bztz2\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.655169 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb5j7\" (UniqueName: \"kubernetes.io/projected/367c2c7c-1fc8-4608-aa94-b64c6c70cc61-kube-api-access-hb5j7\") pod \"csi-snapshot-controller-86897dd478-s29k7\" (UID: \"367c2c7c-1fc8-4608-aa94-b64c6c70cc61\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7"
Dec 03 19:55:35.655294 master-0 kubenswrapper[9368]: I1203 19:55:35.655192 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvllg\" (UniqueName: \"kubernetes.io/projected/87f1759a-7df4-442e-a22d-6de8d54be333-kube-api-access-wvllg\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:55:35.657319 master-0 kubenswrapper[9368]: I1203 19:55:35.655712 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-kubelet\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.657319 master-0 kubenswrapper[9368]: I1203 19:55:35.655767 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/daa8efc0-4514-4a14-80f5-ab9eca53a127-serving-cert\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh"
Dec 03 19:55:35.657319 master-0 kubenswrapper[9368]: I1203 19:55:35.655896 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-os-release\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:55:35.657319 master-0 kubenswrapper[9368]: I1203 19:55:35.656093 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/daa8efc0-4514-4a14-80f5-ab9eca53a127-serving-cert\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh"
Dec 03 19:55:35.657319 master-0 kubenswrapper[9368]: I1203 19:55:35.656132 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/6eb4700c-6af0-468b-afc8-1e09b902d6bf-host-etc-kube\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t"
Dec 03 19:55:35.657319 master-0 kubenswrapper[9368]: I1203 19:55:35.656252 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/6eb4700c-6af0-468b-afc8-1e09b902d6bf-host-etc-kube\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t"
Dec 03 19:55:35.657319 master-0 kubenswrapper[9368]: I1203 19:55:35.656287 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkhcw\" (UniqueName: \"kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw\") pod \"network-check-target-x6vwd\" (UID: \"830d89af-1266-43ac-b113-990a28595f91\") " pod="openshift-network-diagnostics/network-check-target-x6vwd"
Dec 03 19:55:35.657319 master-0 kubenswrapper[9368]: I1203 19:55:35.656327 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sdw4\" (UniqueName: \"kubernetes.io/projected/d5f33153-bff1-403f-ae17-b7e90500365d-kube-api-access-5sdw4\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n"
Dec 03 19:55:35.657319 master-0 kubenswrapper[9368]: I1203 19:55:35.656493 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-ovnkube-config\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.657319 master-0 kubenswrapper[9368]: I1203 19:55:35.656535 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2c85\" (UniqueName: \"kubernetes.io/projected/46b5d4d0-b841-4e87-84b4-85911ff04325-kube-api-access-s2c85\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:55:35.657319 master-0 kubenswrapper[9368]: I1203 19:55:35.656565 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:35.657319 master-0 kubenswrapper[9368]: I1203 19:55:35.656613 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bhk4\" (UniqueName: \"kubernetes.io/projected/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-kube-api-access-6bhk4\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj"
Dec 03 19:55:35.657319 master-0 kubenswrapper[9368]: I1203 19:55:35.656645 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-k8s-cni-cncf-io\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.657319 master-0 kubenswrapper[9368]: I1203 19:55:35.656711 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q"
Dec 03 19:55:35.657769 master-0 kubenswrapper[9368]: I1203 19:55:35.657528 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-hostroot\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.657769 master-0 kubenswrapper[9368]: I1203 19:55:35.657640 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42"
Dec 03 19:55:35.657769 master-0 kubenswrapper[9368]: I1203 19:55:35.657729 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-ovnkube-identity-cm\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn"
Dec 03 19:55:35.658038 master-0 kubenswrapper[9368]: I1203 19:55:35.657920 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n"
Dec 03 19:55:35.658268 master-0 kubenswrapper[9368]: I1203 19:55:35.658071 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 19:55:35.658895 master-0 kubenswrapper[9368]: I1203 19:55:35.658699 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-system-cni-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.658895 master-0 kubenswrapper[9368]: E1203 19:55:35.658711 9368 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Dec 03 19:55:35.658895 master-0 kubenswrapper[9368]: E1203 19:55:35.658753 9368 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Dec 03 19:55:35.658895 master-0 kubenswrapper[9368]: E1203 19:55:35.658778 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert podName:a19b8f9e-6299-43bf-9aa5-22071b855773 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:36.158764309 +0000 UTC m=+1.820014220 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert") pod "olm-operator-76bd5d69c7-wg7fw" (UID: "a19b8f9e-6299-43bf-9aa5-22071b855773") : secret "olm-operator-serving-cert" not found
Dec 03 19:55:35.658895 master-0 kubenswrapper[9368]: E1203 19:55:35.658857 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert podName:0d4e4f88-7106-4a46-8b63-053345922fb0 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:36.15880674 +0000 UTC m=+1.820056651 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-pqz7q" (UID: "0d4e4f88-7106-4a46-8b63-053345922fb0") : secret "package-server-manager-serving-cert" not found
Dec 03 19:55:35.659433 master-0 kubenswrapper[9368]: I1203 19:55:35.659016 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-ovnkube-identity-cm\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn"
Dec 03 19:55:35.659433 master-0 kubenswrapper[9368]: I1203 19:55:35.659275 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-conf-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.659433 master-0 kubenswrapper[9368]: I1203 19:55:35.659381 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-ovn\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.659433 master-0 kubenswrapper[9368]: I1203 19:55:35.659411 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d210062f-c07e-419f-a551-c37571565686-ovnkube-config\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg"
Dec 03 19:55:35.659597 master-0 kubenswrapper[9368]: I1203 19:55:35.659441 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-cnibin\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.659597 master-0 kubenswrapper[9368]: I1203 19:55:35.659476 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-cni-binary-copy\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:55:35.659597 master-0 kubenswrapper[9368]: I1203 19:55:35.659505 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:55:35.659597 master-0 kubenswrapper[9368]: I1203 19:55:35.659536 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j"
Dec 03 19:55:35.659597 master-0 kubenswrapper[9368]: I1203 19:55:35.659568 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d210062f-c07e-419f-a551-c37571565686-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg"
Dec 03 19:55:35.659801 master-0 kubenswrapper[9368]: I1203 19:55:35.659614 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-os-release\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.659801 master-0 kubenswrapper[9368]: I1203 19:55:35.659641 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-multus-certs\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.659801 master-0 kubenswrapper[9368]: I1203 19:55:35.659667 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-trusted-ca\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj"
Dec 03 19:55:35.659801 master-0 kubenswrapper[9368]: I1203 19:55:35.659694 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x66sr\" (UniqueName: \"kubernetes.io/projected/63e3d36d-1676-4f90-ac9a-d85b861a4655-kube-api-access-x66sr\") pod \"service-ca-6b8bb995f7-bj4vz\" (UID: \"63e3d36d-1676-4f90-ac9a-d85b861a4655\") " pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz"
Dec 03 19:55:35.659801 master-0 kubenswrapper[9368]: I1203 19:55:35.659733 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.659801 master-0 kubenswrapper[9368]: I1203 19:55:35.659771 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7xk9\" (UniqueName: \"kubernetes.io/projected/d210062f-c07e-419f-a551-c37571565686-kube-api-access-v7xk9\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg"
Dec 03 19:55:35.660026 master-0 kubenswrapper[9368]: I1203 19:55:35.659821 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-cni-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.660026 master-0 kubenswrapper[9368]: I1203 19:55:35.659850 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-cni-binary-copy\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.660026 master-0 kubenswrapper[9368]: I1203 19:55:35.659866 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-cni-binary-copy\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:55:35.660026 master-0 kubenswrapper[9368]: I1203 19:55:35.659874 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-daemon-config\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.660026 master-0 kubenswrapper[9368]: I1203 19:55:35.659977 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grk2s\" (UniqueName: \"kubernetes.io/projected/2d43df9b-bb29-4581-8cd9-f3b9c0c0e4d9-kube-api-access-grk2s\") pod \"migrator-5bcf58cf9c-h2w9j\" (UID: \"2d43df9b-bb29-4581-8cd9-f3b9c0c0e4d9\") " pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-h2w9j"
Dec 03 19:55:35.660026 master-0 kubenswrapper[9368]: I1203 19:55:35.660009 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhg82\" (UniqueName: \"kubernetes.io/projected/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-kube-api-access-qhg82\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn"
Dec 03 19:55:35.660434 master-0 kubenswrapper[9368]: I1203 19:55:35.660045 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-run-ovn-kubernetes\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.660434 master-0 kubenswrapper[9368]: I1203 19:55:35.660065 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-daemon-config\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.660716 master-0 kubenswrapper[9368]: I1203 19:55:35.660667 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-trusted-ca\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj"
Dec 03 19:55:35.660906 master-0 kubenswrapper[9368]: I1203 19:55:35.660067 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:55:35.660955 master-0 kubenswrapper[9368]: I1203 19:55:35.660919 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:55:35.661008 master-0 kubenswrapper[9368]: I1203 19:55:35.660947 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2"
Dec 03 19:55:35.661095 master-0 kubenswrapper[9368]: E1203 19:55:35.661078 9368 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Dec 03 19:55:35.661141 master-0 kubenswrapper[9368]: I1203 19:55:35.661092 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-cni-binary-copy\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.661141 master-0 kubenswrapper[9368]: I1203 19:55:35.661133 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-systemd-units\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.661221 master-0 kubenswrapper[9368]: I1203 19:55:35.661152 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:35.661221 master-0 kubenswrapper[9368]: E1203 19:55:35.661206 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls podName:128ed384-7ab6-41b6-bf45-c8fda917d52f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:36.161184199 +0000 UTC m=+1.822434190 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls") pod "dns-operator-6b7bcd6566-4wcq2" (UID: "128ed384-7ab6-41b6-bf45-c8fda917d52f") : secret "metrics-tls" not found
Dec 03 19:55:35.661287 master-0 kubenswrapper[9368]: I1203 19:55:35.661247 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/371917da-b783-4acc-81af-1cfc903269f4-iptables-alerter-script\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb"
Dec 03 19:55:35.661287 master-0 kubenswrapper[9368]: I1203 19:55:35.661279 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-log-socket\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.661359 master-0 kubenswrapper[9368]: I1203 19:55:35.661333 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d210062f-c07e-419f-a551-c37571565686-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg"
Dec 03 19:55:35.661390 master-0 kubenswrapper[9368]: I1203 19:55:35.661335 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c9qq\" (UniqueName: \"kubernetes.io/projected/2f618ea7-3ad7-4dce-b450-a8202285f312-kube-api-access-4c9qq\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.661452 master-0 kubenswrapper[9368]: I1203 19:55:35.661432 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-etc-kubernetes\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.661488 master-0 kubenswrapper[9368]: I1203 19:55:35.661453 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/371917da-b783-4acc-81af-1cfc903269f4-iptables-alerter-script\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb"
Dec 03 19:55:35.661488 master-0 kubenswrapper[9368]: I1203 19:55:35.661472 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/63e3d36d-1676-4f90-ac9a-d85b861a4655-signing-key\") pod \"service-ca-6b8bb995f7-bj4vz\" (UID: \"63e3d36d-1676-4f90-ac9a-d85b861a4655\") " pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz"
Dec 03 19:55:35.661542 master-0 kubenswrapper[9368]: I1203 19:55:35.661513 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p"
Dec 03 19:55:35.661569 master-0 kubenswrapper[9368]: I1203 19:55:35.661543 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-env-overrides\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn"
Dec 03 19:55:35.661609 master-0 kubenswrapper[9368]: I1203 19:55:35.661590 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-kubelet\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.661643 master-0 kubenswrapper[9368]: I1203 19:55:35.661617 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-systemd\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.661671 master-0 kubenswrapper[9368]: E1203 19:55:35.661618 9368 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Dec 03 19:55:35.661671 master-0 kubenswrapper[9368]: I1203 19:55:35.661656 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.661739 master-0 kubenswrapper[9368]: E1203 19:55:35.661697 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls podName:5decce88-c71e-411c-87b5-a37dd0f77e7b nodeName:}" failed. No retries permitted until 2025-12-03 19:55:36.161681493 +0000 UTC m=+1.822931494 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-59j4p" (UID: "5decce88-c71e-411c-87b5-a37dd0f77e7b") : secret "image-registry-operator-tls" not found
Dec 03 19:55:35.661770 master-0 kubenswrapper[9368]: I1203 19:55:35.661742 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-cni-bin\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.661834 master-0 kubenswrapper[9368]: I1203 19:55:35.661769 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-cni-multus\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.661834 master-0 kubenswrapper[9368]: I1203 19:55:35.661821 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj"
Dec 03 19:55:35.661892 master-0 kubenswrapper[9368]: I1203 19:55:35.661846 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:55:35.661892 master-0 kubenswrapper[9368]: I1203 19:55:35.661866 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-cnibin\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:55:35.661950 master-0 kubenswrapper[9368]: I1203 19:55:35.661893 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-ovnkube-script-lib\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.661950 master-0 kubenswrapper[9368]: I1203 19:55:35.661911 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa8efc0-4514-4a14-80f5-ab9eca53a127-config\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh"
Dec 03 19:55:35.661950 master-0 kubenswrapper[9368]: I1203 19:55:35.661938 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-node-log\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.662060 master-0 kubenswrapper[9368]: I1203 19:55:35.661954 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv24n\" (UniqueName: \"kubernetes.io/projected/7ed25861-1328-45e7-922e-37588a0b019c-kube-api-access-cv24n\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:35.662060 master-0 kubenswrapper[9368]: I1203 19:55:35.661971 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj"
Dec 03 19:55:35.662060 master-0 kubenswrapper[9368]: I1203 19:55:35.661986 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ed25861-1328-45e7-922e-37588a0b019c-trusted-ca\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:35.662060 master-0 kubenswrapper[9368]: I1203 19:55:35.662025 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-webhook-cert\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn"
Dec 03 19:55:35.662060 master-0 kubenswrapper[9368]: I1203 19:55:35.662046 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74dvx\" (UniqueName: \"kubernetes.io/projected/b4316c8d-a1d3-4e51-83cc-d0eecb809924-kube-api-access-74dvx\") pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID:
\"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" Dec 03 19:55:35.662327 master-0 kubenswrapper[9368]: I1203 19:55:35.662074 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-var-lib-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:35.662327 master-0 kubenswrapper[9368]: I1203 19:55:35.662088 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-cni-netd\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:35.662327 master-0 kubenswrapper[9368]: I1203 19:55:35.662104 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f618ea7-3ad7-4dce-b450-a8202285f312-ovn-node-metrics-cert\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:35.662327 master-0 kubenswrapper[9368]: I1203 19:55:35.662252 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" Dec 03 19:55:35.662327 master-0 kubenswrapper[9368]: I1203 19:55:35.662281 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-run-netns\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:35.662327 master-0 kubenswrapper[9368]: I1203 19:55:35.662233 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa8efc0-4514-4a14-80f5-ab9eca53a127-config\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" Dec 03 19:55:35.662327 master-0 kubenswrapper[9368]: I1203 19:55:35.662312 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-etc-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:35.662819 master-0 kubenswrapper[9368]: I1203 19:55:35.662339 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/371917da-b783-4acc-81af-1cfc903269f4-host-slash\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb" Dec 03 19:55:35.662819 master-0 kubenswrapper[9368]: I1203 19:55:35.662352 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-ovnkube-script-lib\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:35.662819 master-0 kubenswrapper[9368]: I1203 
19:55:35.662366 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c45d22f-1492-47d7-83b6-6dd356a8454d-kube-api-access\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:55:35.662819 master-0 kubenswrapper[9368]: I1203 19:55:35.662369 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ed25861-1328-45e7-922e-37588a0b019c-trusted-ca\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 19:55:35.662819 master-0 kubenswrapper[9368]: E1203 19:55:35.662318 9368 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 03 19:55:35.663336 master-0 kubenswrapper[9368]: E1203 19:55:35.663311 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls podName:ba68608f-6b36-455e-b80b-d19237df9312 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:36.162698748 +0000 UTC m=+1.823948659 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-dhgcv" (UID: "ba68608f-6b36-455e-b80b-d19237df9312") : secret "cluster-monitoring-operator-tls" not found Dec 03 19:55:35.667004 master-0 kubenswrapper[9368]: I1203 19:55:35.666977 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 19:55:35.686573 master-0 kubenswrapper[9368]: I1203 19:55:35.686530 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 19:55:35.688211 master-0 kubenswrapper[9368]: I1203 19:55:35.688158 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-ovnkube-config\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:35.690559 master-0 kubenswrapper[9368]: I1203 19:55:35.690513 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d210062f-c07e-419f-a551-c37571565686-ovnkube-config\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 19:55:35.707633 master-0 kubenswrapper[9368]: I1203 19:55:35.707615 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 19:55:35.729257 master-0 kubenswrapper[9368]: I1203 19:55:35.729195 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 19:55:35.733733 master-0 kubenswrapper[9368]: 
I1203 19:55:35.733602 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-webhook-cert\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 19:55:35.747347 master-0 kubenswrapper[9368]: I1203 19:55:35.747312 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 19:55:35.763724 master-0 kubenswrapper[9368]: I1203 19:55:35.763667 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-config\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" Dec 03 19:55:35.763812 master-0 kubenswrapper[9368]: I1203 19:55:35.763755 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-socket-dir-parent\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:35.763902 master-0 kubenswrapper[9368]: I1203 19:55:35.763878 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-socket-dir-parent\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:35.763902 master-0 kubenswrapper[9368]: I1203 19:55:35.763890 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj5hc\" (UniqueName: 
\"kubernetes.io/projected/82718569-4870-4f94-b2e7-7ccd7d4de8ff-kube-api-access-zj5hc\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" Dec 03 19:55:35.764174 master-0 kubenswrapper[9368]: I1203 19:55:35.763928 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" Dec 03 19:55:35.764281 master-0 kubenswrapper[9368]: I1203 19:55:35.764225 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-kubelet\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:35.764315 master-0 kubenswrapper[9368]: I1203 19:55:35.764290 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-os-release\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 19:55:35.764343 master-0 kubenswrapper[9368]: E1203 19:55:35.764313 9368 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 03 19:55:35.764343 master-0 kubenswrapper[9368]: I1203 19:55:35.764330 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-kubelet\") pod 
\"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:35.764397 master-0 kubenswrapper[9368]: E1203 19:55:35.764380 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca podName:82718569-4870-4f94-b2e7-7ccd7d4de8ff nodeName:}" failed. No retries permitted until 2025-12-03 19:55:36.264357251 +0000 UTC m=+1.925607282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca") pod "route-controller-manager-54bbbcd887-h4khj" (UID: "82718569-4870-4f94-b2e7-7ccd7d4de8ff") : configmap "client-ca" not found Dec 03 19:55:35.764454 master-0 kubenswrapper[9368]: I1203 19:55:35.764436 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 19:55:35.764484 master-0 kubenswrapper[9368]: I1203 19:55:35.764471 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-k8s-cni-cncf-io\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:35.764511 master-0 kubenswrapper[9368]: I1203 19:55:35.764474 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-os-release\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " 
pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 19:55:35.764511 master-0 kubenswrapper[9368]: I1203 19:55:35.764506 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-hostroot\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:35.764566 master-0 kubenswrapper[9368]: I1203 19:55:35.764536 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:55:35.764605 master-0 kubenswrapper[9368]: I1203 19:55:35.764588 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 19:55:35.764678 master-0 kubenswrapper[9368]: E1203 19:55:35.764639 9368 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 03 19:55:35.764678 master-0 kubenswrapper[9368]: I1203 19:55:35.764657 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-hostroot\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:35.764736 master-0 kubenswrapper[9368]: I1203 19:55:35.764634 9368 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-system-cni-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:35.764736 master-0 kubenswrapper[9368]: E1203 19:55:35.764719 9368 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 19:55:35.764813 master-0 kubenswrapper[9368]: I1203 19:55:35.764652 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-k8s-cni-cncf-io\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:35.764813 master-0 kubenswrapper[9368]: E1203 19:55:35.764769 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert podName:7ed25861-1328-45e7-922e-37588a0b019c nodeName:}" failed. No retries permitted until 2025-12-03 19:55:36.26475729 +0000 UTC m=+1.926007441 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert") pod "cluster-node-tuning-operator-bbd9b9dff-vqzdb" (UID: "7ed25861-1328-45e7-922e-37588a0b019c") : secret "performance-addon-operator-webhook-cert" not found Dec 03 19:55:35.764874 master-0 kubenswrapper[9368]: I1203 19:55:35.764816 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-system-cni-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:35.764874 master-0 kubenswrapper[9368]: E1203 19:55:35.764861 9368 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Dec 03 19:55:35.764932 master-0 kubenswrapper[9368]: E1203 19:55:35.764913 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert podName:0c45d22f-1492-47d7-83b6-6dd356a8454d nodeName:}" failed. No retries permitted until 2025-12-03 19:55:36.264884654 +0000 UTC m=+1.926134595 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert") pod "cluster-version-operator-869c786959-zbl42" (UID: "0c45d22f-1492-47d7-83b6-6dd356a8454d") : secret "cluster-version-operator-serving-cert" not found Dec 03 19:55:35.765016 master-0 kubenswrapper[9368]: E1203 19:55:35.764979 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert podName:d5f33153-bff1-403f-ae17-b7e90500365d nodeName:}" failed. No retries permitted until 2025-12-03 19:55:36.264944255 +0000 UTC m=+1.926194166 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert") pod "catalog-operator-7cf5cf757f-25z8n" (UID: "d5f33153-bff1-403f-ae17-b7e90500365d") : secret "catalog-operator-serving-cert" not found Dec 03 19:55:35.765050 master-0 kubenswrapper[9368]: I1203 19:55:35.765024 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-conf-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:35.765076 master-0 kubenswrapper[9368]: I1203 19:55:35.765051 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-cnibin\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:35.765102 master-0 kubenswrapper[9368]: I1203 19:55:35.765069 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-conf-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:35.765102 master-0 kubenswrapper[9368]: I1203 19:55:35.765090 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-ovn\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:35.765159 master-0 kubenswrapper[9368]: I1203 19:55:35.765124 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x66sr\" (UniqueName: 
\"kubernetes.io/projected/63e3d36d-1676-4f90-ac9a-d85b861a4655-kube-api-access-x66sr\") pod \"service-ca-6b8bb995f7-bj4vz\" (UID: \"63e3d36d-1676-4f90-ac9a-d85b861a4655\") " pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" Dec 03 19:55:35.765159 master-0 kubenswrapper[9368]: I1203 19:55:35.765138 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-cnibin\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:35.765159 master-0 kubenswrapper[9368]: I1203 19:55:35.765145 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" Dec 03 19:55:35.765242 master-0 kubenswrapper[9368]: I1203 19:55:35.765173 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-os-release\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:35.765275 master-0 kubenswrapper[9368]: I1203 19:55:35.765233 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-multus-certs\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:35.765323 master-0 kubenswrapper[9368]: E1203 19:55:35.765262 9368 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 
03 19:55:35.765323 master-0 kubenswrapper[9368]: I1203 19:55:35.765318 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:35.765378 master-0 kubenswrapper[9368]: I1203 19:55:35.765327 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-os-release\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:35.765378 master-0 kubenswrapper[9368]: I1203 19:55:35.765257 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-ovn\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:35.765378 master-0 kubenswrapper[9368]: I1203 19:55:35.765327 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-multus-certs\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:35.765378 master-0 kubenswrapper[9368]: I1203 19:55:35.765277 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-config\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" Dec 03 19:55:35.765481 master-0 kubenswrapper[9368]: E1203 
19:55:35.765396 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs podName:b4316c8d-a1d3-4e51-83cc-d0eecb809924 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:36.265378507 +0000 UTC m=+1.926628448 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs") pod "multus-admission-controller-78ddcf56f9-nqn2j" (UID: "b4316c8d-a1d3-4e51-83cc-d0eecb809924") : secret "multus-admission-controller-secret" not found
Dec 03 19:55:35.765481 master-0 kubenswrapper[9368]: I1203 19:55:35.765287 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.765538 master-0 kubenswrapper[9368]: I1203 19:55:35.765521 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-cni-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.765591 master-0 kubenswrapper[9368]: I1203 19:55:35.765569 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grk2s\" (UniqueName: \"kubernetes.io/projected/2d43df9b-bb29-4581-8cd9-f3b9c0c0e4d9-kube-api-access-grk2s\") pod \"migrator-5bcf58cf9c-h2w9j\" (UID: \"2d43df9b-bb29-4581-8cd9-f3b9c0c0e4d9\") " pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-h2w9j"
Dec 03 19:55:35.765672 master-0 kubenswrapper[9368]: I1203 19:55:35.765651 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-run-ovn-kubernetes\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.765708 master-0 kubenswrapper[9368]: I1203 19:55:35.765685 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:55:35.765739 master-0 kubenswrapper[9368]: I1203 19:55:35.765698 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-run-ovn-kubernetes\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.765767 master-0 kubenswrapper[9368]: I1203 19:55:35.765710 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-systemd-units\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.765825 master-0 kubenswrapper[9368]: I1203 19:55:35.765777 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:55:35.765825 master-0 kubenswrapper[9368]: I1203 19:55:35.765732 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-systemd-units\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.765883 master-0 kubenswrapper[9368]: I1203 19:55:35.765634 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-cni-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.765883 master-0 kubenswrapper[9368]: I1203 19:55:35.765835 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:35.765942 master-0 kubenswrapper[9368]: I1203 19:55:35.765920 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-log-socket\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.766015 master-0 kubenswrapper[9368]: I1203 19:55:35.765990 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-etc-kubernetes\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.766045 master-0 kubenswrapper[9368]: I1203 19:55:35.766013 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-log-socket\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.766045 master-0 kubenswrapper[9368]: E1203 19:55:35.765990 9368 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Dec 03 19:55:35.766045 master-0 kubenswrapper[9368]: I1203 19:55:35.766033 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/63e3d36d-1676-4f90-ac9a-d85b861a4655-signing-key\") pod \"service-ca-6b8bb995f7-bj4vz\" (UID: \"63e3d36d-1676-4f90-ac9a-d85b861a4655\") " pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz"
Dec 03 19:55:35.766124 master-0 kubenswrapper[9368]: I1203 19:55:35.766068 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-kubelet\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.766124 master-0 kubenswrapper[9368]: I1203 19:55:35.766018 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-etc-kubernetes\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.766124 master-0 kubenswrapper[9368]: E1203 19:55:35.766085 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls podName:7ed25861-1328-45e7-922e-37588a0b019c nodeName:}" failed. No retries permitted until 2025-12-03 19:55:36.266064884 +0000 UTC m=+1.927314785 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bbd9b9dff-vqzdb" (UID: "7ed25861-1328-45e7-922e-37588a0b019c") : secret "node-tuning-operator-tls" not found
Dec 03 19:55:35.766208 master-0 kubenswrapper[9368]: I1203 19:55:35.766148 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-systemd\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.766208 master-0 kubenswrapper[9368]: I1203 19:55:35.766185 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.766208 master-0 kubenswrapper[9368]: I1203 19:55:35.766205 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-systemd\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.766295 master-0 kubenswrapper[9368]: I1203 19:55:35.766222 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-cni-bin\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.766295 master-0 kubenswrapper[9368]: I1203 19:55:35.766190 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-kubelet\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.766295 master-0 kubenswrapper[9368]: I1203 19:55:35.766245 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.766295 master-0 kubenswrapper[9368]: I1203 19:55:35.766256 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-cni-multus\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.766295 master-0 kubenswrapper[9368]: I1203 19:55:35.766282 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-cni-multus\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.766295 master-0 kubenswrapper[9368]: I1203 19:55:35.766293 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj"
Dec 03 19:55:35.766453 master-0 kubenswrapper[9368]: I1203 19:55:35.766314 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-cni-bin\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.766453 master-0 kubenswrapper[9368]: I1203 19:55:35.766333 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:55:35.766453 master-0 kubenswrapper[9368]: E1203 19:55:35.766345 9368 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Dec 03 19:55:35.766453 master-0 kubenswrapper[9368]: E1203 19:55:35.766371 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls podName:3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf nodeName:}" failed. No retries permitted until 2025-12-03 19:55:36.266364571 +0000 UTC m=+1.927614472 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls") pod "ingress-operator-85dbd94574-l7bzj" (UID: "3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf") : secret "metrics-tls" not found
Dec 03 19:55:35.766453 master-0 kubenswrapper[9368]: I1203 19:55:35.766367 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-cnibin\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:55:35.766453 master-0 kubenswrapper[9368]: I1203 19:55:35.766408 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-cnibin\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:55:35.766453 master-0 kubenswrapper[9368]: I1203 19:55:35.766429 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-node-log\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.766453 master-0 kubenswrapper[9368]: I1203 19:55:35.766455 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-node-log\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.766681 master-0 kubenswrapper[9368]: I1203 19:55:35.766466 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj"
Dec 03 19:55:35.766681 master-0 kubenswrapper[9368]: E1203 19:55:35.766416 9368 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Dec 03 19:55:35.766681 master-0 kubenswrapper[9368]: I1203 19:55:35.766477 9368 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Dec 03 19:55:35.766681 master-0 kubenswrapper[9368]: E1203 19:55:35.766545 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs podName:46b5d4d0-b841-4e87-84b4-85911ff04325 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:36.266527685 +0000 UTC m=+1.927777596 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs") pod "network-metrics-daemon-hs6gf" (UID: "46b5d4d0-b841-4e87-84b4-85911ff04325") : secret "metrics-daemon-secret" not found
Dec 03 19:55:35.766681 master-0 kubenswrapper[9368]: I1203 19:55:35.766521 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-var-lib-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.766681 master-0 kubenswrapper[9368]: I1203 19:55:35.766581 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-cni-netd\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.766681 master-0 kubenswrapper[9368]: E1203 19:55:35.766616 9368 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Dec 03 19:55:35.766681 master-0 kubenswrapper[9368]: I1203 19:55:35.766626 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-var-lib-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.766681 master-0 kubenswrapper[9368]: I1203 19:55:35.766641 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-run-netns\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.766681 master-0 kubenswrapper[9368]: I1203 19:55:35.766665 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-cni-netd\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.766681 master-0 kubenswrapper[9368]: I1203 19:55:35.766667 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-run-netns\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.767040 master-0 kubenswrapper[9368]: I1203 19:55:35.766686 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-etc-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.767040 master-0 kubenswrapper[9368]: E1203 19:55:35.766730 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert podName:82718569-4870-4f94-b2e7-7ccd7d4de8ff nodeName:}" failed. No retries permitted until 2025-12-03 19:55:36.266711111 +0000 UTC m=+1.927961062 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert") pod "route-controller-manager-54bbbcd887-h4khj" (UID: "82718569-4870-4f94-b2e7-7ccd7d4de8ff") : secret "serving-cert" not found
Dec 03 19:55:35.767040 master-0 kubenswrapper[9368]: I1203 19:55:35.766725 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-etc-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.767040 master-0 kubenswrapper[9368]: I1203 19:55:35.766750 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/371917da-b783-4acc-81af-1cfc903269f4-host-slash\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb"
Dec 03 19:55:35.767040 master-0 kubenswrapper[9368]: I1203 19:55:35.766819 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-cni-bin\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.767040 master-0 kubenswrapper[9368]: I1203 19:55:35.766819 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/371917da-b783-4acc-81af-1cfc903269f4-host-slash\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb"
Dec 03 19:55:35.767040 master-0 kubenswrapper[9368]: I1203 19:55:35.766838 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0c45d22f-1492-47d7-83b6-6dd356a8454d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42"
Dec 03 19:55:35.767040 master-0 kubenswrapper[9368]: I1203 19:55:35.766864 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-cni-bin\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.767040 master-0 kubenswrapper[9368]: I1203 19:55:35.766883 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/63e3d36d-1676-4f90-ac9a-d85b861a4655-signing-cabundle\") pod \"service-ca-6b8bb995f7-bj4vz\" (UID: \"63e3d36d-1676-4f90-ac9a-d85b861a4655\") " pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz"
Dec 03 19:55:35.767040 master-0 kubenswrapper[9368]: I1203 19:55:35.766900 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0c45d22f-1492-47d7-83b6-6dd356a8454d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42"
Dec 03 19:55:35.767040 master-0 kubenswrapper[9368]: I1203 19:55:35.766905 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-system-cni-dir\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:55:35.767040 master-0 kubenswrapper[9368]: I1203 19:55:35.766932 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-slash\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.767040 master-0 kubenswrapper[9368]: I1203 19:55:35.766991 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-netns\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.767040 master-0 kubenswrapper[9368]: I1203 19:55:35.766995 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-system-cni-dir\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 19:55:35.767040 master-0 kubenswrapper[9368]: I1203 19:55:35.767011 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0c45d22f-1492-47d7-83b6-6dd356a8454d-etc-ssl-certs\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42"
Dec 03 19:55:35.767040 master-0 kubenswrapper[9368]: I1203 19:55:35.767055 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-slash\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.767702 master-0 kubenswrapper[9368]: I1203 19:55:35.767101 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-netns\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 19:55:35.767702 master-0 kubenswrapper[9368]: I1203 19:55:35.767117 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0c45d22f-1492-47d7-83b6-6dd356a8454d-etc-ssl-certs\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42"
Dec 03 19:55:35.767702 master-0 kubenswrapper[9368]: I1203 19:55:35.767558 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/63e3d36d-1676-4f90-ac9a-d85b861a4655-signing-cabundle\") pod \"service-ca-6b8bb995f7-bj4vz\" (UID: \"63e3d36d-1676-4f90-ac9a-d85b861a4655\") " pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz"
Dec 03 19:55:35.775769 master-0 kubenswrapper[9368]: I1203 19:55:35.775741 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/63e3d36d-1676-4f90-ac9a-d85b861a4655-signing-key\") pod \"service-ca-6b8bb995f7-bj4vz\" (UID: \"63e3d36d-1676-4f90-ac9a-d85b861a4655\") " pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz"
Dec 03 19:55:35.783495 master-0 kubenswrapper[9368]: I1203 19:55:35.783437 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk4z4\" (UniqueName: \"kubernetes.io/projected/f9f99422-7991-40ef-92a1-de2e603e47b9-kube-api-access-pk4z4\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl"
Dec 03 19:55:35.808518 master-0 kubenswrapper[9368]: I1203 19:55:35.806157 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sqtm\" (UniqueName: \"kubernetes.io/projected/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-kube-api-access-6sqtm\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5"
Dec 03 19:55:35.808518 master-0 kubenswrapper[9368]: I1203 19:55:35.806879 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 03 19:55:35.809157 master-0 kubenswrapper[9368]: I1203 19:55:35.809078 9368 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 03 19:55:35.813811 master-0 kubenswrapper[9368]: I1203 19:55:35.813731 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-env-overrides\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn"
Dec 03 19:55:35.827642 master-0 kubenswrapper[9368]: I1203 19:55:35.827592 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 03 19:55:35.834927 master-0 kubenswrapper[9368]: I1203 19:55:35.834876 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d210062f-c07e-419f-a551-c37571565686-env-overrides\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg"
Dec 03 19:55:35.835973 master-0 kubenswrapper[9368]: I1203 19:55:35.835935 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-env-overrides\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.848243 master-0 kubenswrapper[9368]: I1203 19:55:35.848210 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 03 19:55:35.887637 master-0 kubenswrapper[9368]: I1203 19:55:35.887516 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 03 19:55:35.889196 master-0 kubenswrapper[9368]: I1203 19:55:35.889058 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5decce88-c71e-411c-87b5-a37dd0f77e7b-bound-sa-token\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p"
Dec 03 19:55:35.909061 master-0 kubenswrapper[9368]: I1203 19:55:35.908988 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Dec 03 19:55:35.914290 master-0 kubenswrapper[9368]: I1203 19:55:35.913532 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f618ea7-3ad7-4dce-b450-a8202285f312-ovn-node-metrics-cert\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:35.945161 master-0 kubenswrapper[9368]: I1203 19:55:35.945114 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx"
Dec 03 19:55:35.960384 master-0 kubenswrapper[9368]: I1203 19:55:35.960316 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfqnq\" (UniqueName: \"kubernetes.io/projected/11e2c94f-f9e9-415b-a550-3006a4632ba4-kube-api-access-pfqnq\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5"
Dec 03 19:55:35.968548 master-0 kubenswrapper[9368]: I1203 19:55:35.968511 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 03 19:55:36.021090 master-0 kubenswrapper[9368]: I1203 19:55:36.020949 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxqph\" (UniqueName: \"kubernetes.io/projected/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-kube-api-access-sxqph\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 19:55:36.028643 master-0 kubenswrapper[9368]: I1203 19:55:36.028613 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtwbs\" (UniqueName: \"kubernetes.io/projected/b84835e3-e8bc-4aa4-a8f3-f9be702a358a-kube-api-access-vtwbs\") pod \"csi-snapshot-controller-operator-7b795784b8-4gppw\" (UID: \"b84835e3-e8bc-4aa4-a8f3-f9be702a358a\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw"
Dec 03 19:55:36.049542 master-0 kubenswrapper[9368]: I1203 19:55:36.049493 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ghnf\" (UniqueName: \"kubernetes.io/projected/a19b8f9e-6299-43bf-9aa5-22071b855773-kube-api-access-6ghnf\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 19:55:36.070116 master-0 kubenswrapper[9368]: I1203 19:55:36.070071 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-457ln\" (UniqueName: \"kubernetes.io/projected/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-kube-api-access-457ln\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt"
Dec 03 19:55:36.079086 master-0 kubenswrapper[9368]: I1203 19:55:36.079041 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr8x9\" (UniqueName: \"kubernetes.io/projected/5decce88-c71e-411c-87b5-a37dd0f77e7b-kube-api-access-mr8x9\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p"
Dec 03 19:55:36.108824 master-0 kubenswrapper[9368]: I1203 19:55:36.108742 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-855t4\" (UniqueName: \"kubernetes.io/projected/ba68608f-6b36-455e-b80b-d19237df9312-kube-api-access-855t4\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"
Dec 03 19:55:36.132187 master-0 kubenswrapper[9368]: I1203 19:55:36.132142 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59d2r\" (UniqueName: \"kubernetes.io/projected/78a864f2-934f-4197-9753-24c9bc7f1fca-kube-api-access-59d2r\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf"
Dec 03 19:55:36.140275 master-0 kubenswrapper[9368]: I1203 19:55:36.140229 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdhcd\" (UniqueName: \"kubernetes.io/projected/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-kube-api-access-qdhcd\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv"
Dec 03 19:55:36.160357 master-0 kubenswrapper[9368]: I1203 19:55:36.160302 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7nkb\" (UniqueName: \"kubernetes.io/projected/6eb4700c-6af0-468b-afc8-1e09b902d6bf-kube-api-access-w7nkb\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t"
Dec 03 19:55:36.172168 master-0 kubenswrapper[9368]: I1203 19:55:36.172121 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p"
Dec 03 19:55:36.172294 master-0 kubenswrapper[9368]: I1203 19:55:36.172215 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"
Dec 03 19:55:36.172294 master-0 kubenswrapper[9368]: I1203 19:55:36.172250 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6"
Dec 03 19:55:36.172441 master-0 kubenswrapper[9368]: I1203 19:55:36.172399 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q"
Dec 03 19:55:36.172549 master-0 kubenswrapper[9368]: E1203 19:55:36.172466 9368 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Dec 03 19:55:36.172549 master-0 kubenswrapper[9368]: I1203 19:55:36.172487 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 19:55:36.172630 master-0 kubenswrapper[9368]: E1203 19:55:36.172504 9368 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Dec 03 19:55:36.172630 master-0 kubenswrapper[9368]: E1203 19:55:36.172582 9368 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Dec 03 19:55:36.172703 master-0
kubenswrapper[9368]: E1203 19:55:36.172512 9368 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 03 19:55:36.172703 master-0 kubenswrapper[9368]: E1203 19:55:36.172646 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert podName:0d4e4f88-7106-4a46-8b63-053345922fb0 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:37.172625701 +0000 UTC m=+2.833875632 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-pqz7q" (UID: "0d4e4f88-7106-4a46-8b63-053345922fb0") : secret "package-server-manager-serving-cert" not found Dec 03 19:55:36.172703 master-0 kubenswrapper[9368]: I1203 19:55:36.172586 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" Dec 03 19:55:36.172703 master-0 kubenswrapper[9368]: E1203 19:55:36.172655 9368 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 03 19:55:36.172703 master-0 kubenswrapper[9368]: E1203 19:55:36.172690 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics podName:b673cb04-f6f0-4113-bdcd-d6685b942c9f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:37.172671892 +0000 UTC m=+2.833921813 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-xqvv6" (UID: "b673cb04-f6f0-4113-bdcd-d6685b942c9f") : secret "marketplace-operator-metrics" not found Dec 03 19:55:36.172928 master-0 kubenswrapper[9368]: E1203 19:55:36.172712 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls podName:128ed384-7ab6-41b6-bf45-c8fda917d52f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:37.172700333 +0000 UTC m=+2.833950264 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls") pod "dns-operator-6b7bcd6566-4wcq2" (UID: "128ed384-7ab6-41b6-bf45-c8fda917d52f") : secret "metrics-tls" not found Dec 03 19:55:36.172928 master-0 kubenswrapper[9368]: E1203 19:55:36.172662 9368 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Dec 03 19:55:36.172928 master-0 kubenswrapper[9368]: E1203 19:55:36.172733 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls podName:ba68608f-6b36-455e-b80b-d19237df9312 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:37.172722654 +0000 UTC m=+2.833972685 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-dhgcv" (UID: "ba68608f-6b36-455e-b80b-d19237df9312") : secret "cluster-monitoring-operator-tls" not found Dec 03 19:55:36.172928 master-0 kubenswrapper[9368]: E1203 19:55:36.172857 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert podName:a19b8f9e-6299-43bf-9aa5-22071b855773 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:37.172823416 +0000 UTC m=+2.834073377 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert") pod "olm-operator-76bd5d69c7-wg7fw" (UID: "a19b8f9e-6299-43bf-9aa5-22071b855773") : secret "olm-operator-serving-cert" not found Dec 03 19:55:36.173088 master-0 kubenswrapper[9368]: E1203 19:55:36.172995 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls podName:5decce88-c71e-411c-87b5-a37dd0f77e7b nodeName:}" failed. No retries permitted until 2025-12-03 19:55:37.17298223 +0000 UTC m=+2.834232221 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-59j4p" (UID: "5decce88-c71e-411c-87b5-a37dd0f77e7b") : secret "image-registry-operator-tls" not found Dec 03 19:55:36.179083 master-0 kubenswrapper[9368]: I1203 19:55:36.179044 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/943feb0d-7d31-446a-9100-dfc4ef013d12-kube-api-access\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" Dec 03 19:55:36.209209 master-0 kubenswrapper[9368]: I1203 19:55:36.209157 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qrgh\" (UniqueName: \"kubernetes.io/projected/128ed384-7ab6-41b6-bf45-c8fda917d52f-kube-api-access-7qrgh\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" Dec 03 19:55:36.221702 master-0 kubenswrapper[9368]: I1203 19:55:36.221650 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2qch\" (UniqueName: \"kubernetes.io/projected/b673cb04-f6f0-4113-bdcd-d6685b942c9f-kube-api-access-m2qch\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:36.248580 master-0 kubenswrapper[9368]: I1203 19:55:36.248524 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-kube-api-access\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" Dec 03 19:55:36.249216 master-0 kubenswrapper[9368]: I1203 19:55:36.249088 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:55:36.253957 master-0 kubenswrapper[9368]: I1203 19:55:36.253924 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:55:36.260099 master-0 kubenswrapper[9368]: I1203 19:55:36.260059 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crfnp\" (UniqueName: \"kubernetes.io/projected/0d4e4f88-7106-4a46-8b63-053345922fb0-kube-api-access-crfnp\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 19:55:36.274586 master-0 kubenswrapper[9368]: I1203 19:55:36.274513 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" Dec 03 19:55:36.274586 master-0 kubenswrapper[9368]: I1203 19:55:36.274572 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 19:55:36.274696 master-0 kubenswrapper[9368]: I1203 19:55:36.274605 9368 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:55:36.274696 master-0 kubenswrapper[9368]: I1203 19:55:36.274623 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 19:55:36.274696 master-0 kubenswrapper[9368]: I1203 19:55:36.274659 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" Dec 03 19:55:36.274696 master-0 kubenswrapper[9368]: E1203 19:55:36.274686 9368 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 03 19:55:36.274892 master-0 kubenswrapper[9368]: I1203 19:55:36.274704 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 19:55:36.274892 master-0 kubenswrapper[9368]: E1203 19:55:36.274748 9368 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca podName:82718569-4870-4f94-b2e7-7ccd7d4de8ff nodeName:}" failed. No retries permitted until 2025-12-03 19:55:37.274728194 +0000 UTC m=+2.935978115 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca") pod "route-controller-manager-54bbbcd887-h4khj" (UID: "82718569-4870-4f94-b2e7-7ccd7d4de8ff") : configmap "client-ca" not found Dec 03 19:55:36.274892 master-0 kubenswrapper[9368]: E1203 19:55:36.274833 9368 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 03 19:55:36.274892 master-0 kubenswrapper[9368]: I1203 19:55:36.274843 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 19:55:36.274892 master-0 kubenswrapper[9368]: E1203 19:55:36.274877 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls podName:7ed25861-1328-45e7-922e-37588a0b019c nodeName:}" failed. No retries permitted until 2025-12-03 19:55:37.274863298 +0000 UTC m=+2.936113209 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bbd9b9dff-vqzdb" (UID: "7ed25861-1328-45e7-922e-37588a0b019c") : secret "node-tuning-operator-tls" not found Dec 03 19:55:36.275034 master-0 kubenswrapper[9368]: I1203 19:55:36.274898 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:55:36.275034 master-0 kubenswrapper[9368]: E1203 19:55:36.274927 9368 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 03 19:55:36.275034 master-0 kubenswrapper[9368]: E1203 19:55:36.274959 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls podName:3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf nodeName:}" failed. No retries permitted until 2025-12-03 19:55:37.27494958 +0000 UTC m=+2.936199501 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls") pod "ingress-operator-85dbd94574-l7bzj" (UID: "3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf") : secret "metrics-tls" not found Dec 03 19:55:36.275034 master-0 kubenswrapper[9368]: E1203 19:55:36.274962 9368 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 03 19:55:36.275034 master-0 kubenswrapper[9368]: E1203 19:55:36.274986 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert podName:82718569-4870-4f94-b2e7-7ccd7d4de8ff nodeName:}" failed. No retries permitted until 2025-12-03 19:55:37.274978601 +0000 UTC m=+2.936228512 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert") pod "route-controller-manager-54bbbcd887-h4khj" (UID: "82718569-4870-4f94-b2e7-7ccd7d4de8ff") : secret "serving-cert" not found Dec 03 19:55:36.275034 master-0 kubenswrapper[9368]: E1203 19:55:36.275012 9368 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Dec 03 19:55:36.275034 master-0 kubenswrapper[9368]: E1203 19:55:36.275021 9368 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 03 19:55:36.275034 master-0 kubenswrapper[9368]: E1203 19:55:36.275036 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert podName:d5f33153-bff1-403f-ae17-b7e90500365d nodeName:}" failed. No retries permitted until 2025-12-03 19:55:37.275028612 +0000 UTC m=+2.936278543 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert") pod "catalog-operator-7cf5cf757f-25z8n" (UID: "d5f33153-bff1-403f-ae17-b7e90500365d") : secret "catalog-operator-serving-cert" not found Dec 03 19:55:36.275227 master-0 kubenswrapper[9368]: E1203 19:55:36.275051 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs podName:b4316c8d-a1d3-4e51-83cc-d0eecb809924 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:37.275044303 +0000 UTC m=+2.936294224 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs") pod "multus-admission-controller-78ddcf56f9-nqn2j" (UID: "b4316c8d-a1d3-4e51-83cc-d0eecb809924") : secret "multus-admission-controller-secret" not found Dec 03 19:55:36.275227 master-0 kubenswrapper[9368]: E1203 19:55:36.275063 9368 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 03 19:55:36.275227 master-0 kubenswrapper[9368]: E1203 19:55:36.275080 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs podName:46b5d4d0-b841-4e87-84b4-85911ff04325 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:37.275075173 +0000 UTC m=+2.936325084 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs") pod "network-metrics-daemon-hs6gf" (UID: "46b5d4d0-b841-4e87-84b4-85911ff04325") : secret "metrics-daemon-secret" not found Dec 03 19:55:36.275227 master-0 kubenswrapper[9368]: I1203 19:55:36.274928 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" Dec 03 19:55:36.275227 master-0 kubenswrapper[9368]: E1203 19:55:36.275093 9368 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 03 19:55:36.275227 master-0 kubenswrapper[9368]: E1203 19:55:36.275119 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert podName:7ed25861-1328-45e7-922e-37588a0b019c nodeName:}" failed. No retries permitted until 2025-12-03 19:55:37.275109854 +0000 UTC m=+2.936359775 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert") pod "cluster-node-tuning-operator-bbd9b9dff-vqzdb" (UID: "7ed25861-1328-45e7-922e-37588a0b019c") : secret "performance-addon-operator-webhook-cert" not found Dec 03 19:55:36.275227 master-0 kubenswrapper[9368]: E1203 19:55:36.275141 9368 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 19:55:36.275227 master-0 kubenswrapper[9368]: E1203 19:55:36.275163 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert podName:0c45d22f-1492-47d7-83b6-6dd356a8454d nodeName:}" failed. No retries permitted until 2025-12-03 19:55:37.275158155 +0000 UTC m=+2.936408066 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert") pod "cluster-version-operator-869c786959-zbl42" (UID: "0c45d22f-1492-47d7-83b6-6dd356a8454d") : secret "cluster-version-operator-serving-cert" not found Dec 03 19:55:36.293268 master-0 kubenswrapper[9368]: E1203 19:55:36.292767 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 19:55:36.315064 master-0 kubenswrapper[9368]: E1203 19:55:36.314971 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:55:36.337330 master-0 kubenswrapper[9368]: E1203 19:55:36.336642 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 
19:55:36.352299 master-0 kubenswrapper[9368]: E1203 19:55:36.352241 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 19:55:36.375409 master-0 kubenswrapper[9368]: E1203 19:55:36.375341 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 19:55:36.402649 master-0 kubenswrapper[9368]: I1203 19:55:36.402551 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4v7k\" (UniqueName: \"kubernetes.io/projected/371917da-b783-4acc-81af-1cfc903269f4-kube-api-access-w4v7k\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb" Dec 03 19:55:36.432661 master-0 kubenswrapper[9368]: I1203 19:55:36.432581 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbsx8\" (UniqueName: \"kubernetes.io/projected/daa8efc0-4514-4a14-80f5-ab9eca53a127-kube-api-access-rbsx8\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" Dec 03 19:55:36.450826 master-0 kubenswrapper[9368]: I1203 19:55:36.450745 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-bound-sa-token\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 19:55:36.462008 master-0 kubenswrapper[9368]: I1203 19:55:36.461963 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bztz2\" 
(UniqueName: \"kubernetes.io/projected/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-kube-api-access-bztz2\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 19:55:36.493284 master-0 kubenswrapper[9368]: I1203 19:55:36.493188 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb5j7\" (UniqueName: \"kubernetes.io/projected/367c2c7c-1fc8-4608-aa94-b64c6c70cc61-kube-api-access-hb5j7\") pod \"csi-snapshot-controller-86897dd478-s29k7\" (UID: \"367c2c7c-1fc8-4608-aa94-b64c6c70cc61\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" Dec 03 19:55:36.509416 master-0 kubenswrapper[9368]: I1203 19:55:36.509341 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvllg\" (UniqueName: \"kubernetes.io/projected/87f1759a-7df4-442e-a22d-6de8d54be333-kube-api-access-wvllg\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 19:55:36.524098 master-0 kubenswrapper[9368]: I1203 19:55:36.524013 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkhcw\" (UniqueName: \"kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw\") pod \"network-check-target-x6vwd\" (UID: \"830d89af-1266-43ac-b113-990a28595f91\") " pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:55:36.552533 master-0 kubenswrapper[9368]: I1203 19:55:36.552302 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sdw4\" (UniqueName: \"kubernetes.io/projected/d5f33153-bff1-403f-ae17-b7e90500365d-kube-api-access-5sdw4\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 19:55:36.565816 master-0 
kubenswrapper[9368]: I1203 19:55:36.565729 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bhk4\" (UniqueName: \"kubernetes.io/projected/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-kube-api-access-6bhk4\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 19:55:36.582499 master-0 kubenswrapper[9368]: I1203 19:55:36.582418 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2c85\" (UniqueName: \"kubernetes.io/projected/46b5d4d0-b841-4e87-84b4-85911ff04325-kube-api-access-s2c85\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:55:36.602999 master-0 kubenswrapper[9368]: I1203 19:55:36.602876 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhg82\" (UniqueName: \"kubernetes.io/projected/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-kube-api-access-qhg82\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 19:55:36.620821 master-0 kubenswrapper[9368]: I1203 19:55:36.620690 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7xk9\" (UniqueName: \"kubernetes.io/projected/d210062f-c07e-419f-a551-c37571565686-kube-api-access-v7xk9\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 19:55:36.643668 master-0 kubenswrapper[9368]: I1203 19:55:36.643549 9368 generic.go:334] "Generic (PLEG): container finished" podID="f9f99422-7991-40ef-92a1-de2e603e47b9" containerID="0651574b36c6a4f52acd96c11c41f938e0a9dc2320440d248364735d4b37969d" exitCode=0 Dec 03 19:55:36.643668 master-0 
kubenswrapper[9368]: I1203 19:55:36.643683 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" event={"ID":"f9f99422-7991-40ef-92a1-de2e603e47b9","Type":"ContainerDied","Data":"0651574b36c6a4f52acd96c11c41f938e0a9dc2320440d248364735d4b37969d"} Dec 03 19:55:36.645383 master-0 kubenswrapper[9368]: I1203 19:55:36.645074 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c9qq\" (UniqueName: \"kubernetes.io/projected/2f618ea7-3ad7-4dce-b450-a8202285f312-kube-api-access-4c9qq\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:36.666421 master-0 kubenswrapper[9368]: I1203 19:55:36.666348 9368 request.go:700] Waited for 1.004168396s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-multus/serviceaccounts/multus-ac/token Dec 03 19:55:36.674859 master-0 kubenswrapper[9368]: I1203 19:55:36.674771 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv24n\" (UniqueName: \"kubernetes.io/projected/7ed25861-1328-45e7-922e-37588a0b019c-kube-api-access-cv24n\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 19:55:36.681320 master-0 kubenswrapper[9368]: I1203 19:55:36.681274 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74dvx\" (UniqueName: \"kubernetes.io/projected/b4316c8d-a1d3-4e51-83cc-d0eecb809924-kube-api-access-74dvx\") pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" Dec 03 19:55:36.707936 master-0 kubenswrapper[9368]: I1203 
19:55:36.707852 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c45d22f-1492-47d7-83b6-6dd356a8454d-kube-api-access\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:55:36.731507 master-0 kubenswrapper[9368]: I1203 19:55:36.731460 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" Dec 03 19:55:36.734614 master-0 kubenswrapper[9368]: I1203 19:55:36.734575 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj5hc\" (UniqueName: \"kubernetes.io/projected/82718569-4870-4f94-b2e7-7ccd7d4de8ff-kube-api-access-zj5hc\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" Dec 03 19:55:36.745528 master-0 kubenswrapper[9368]: I1203 19:55:36.745492 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x66sr\" (UniqueName: \"kubernetes.io/projected/63e3d36d-1676-4f90-ac9a-d85b861a4655-kube-api-access-x66sr\") pod \"service-ca-6b8bb995f7-bj4vz\" (UID: \"63e3d36d-1676-4f90-ac9a-d85b861a4655\") " pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" Dec 03 19:55:36.764390 master-0 kubenswrapper[9368]: I1203 19:55:36.764353 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grk2s\" (UniqueName: \"kubernetes.io/projected/2d43df9b-bb29-4581-8cd9-f3b9c0c0e4d9-kube-api-access-grk2s\") pod \"migrator-5bcf58cf9c-h2w9j\" (UID: \"2d43df9b-bb29-4581-8cd9-f3b9c0c0e4d9\") " pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-h2w9j" Dec 03 19:55:36.777572 master-0 kubenswrapper[9368]: I1203 19:55:36.777529 9368 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-h2w9j" Dec 03 19:55:37.033326 master-0 kubenswrapper[9368]: I1203 19:55:37.033255 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" Dec 03 19:55:37.033326 master-0 kubenswrapper[9368]: I1203 19:55:37.033290 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-h2w9j"] Dec 03 19:55:37.043964 master-0 kubenswrapper[9368]: W1203 19:55:37.043635 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d43df9b_bb29_4581_8cd9_f3b9c0c0e4d9.slice/crio-9ae1ee41d37f4b0f2aff315d3bc5733756252272483e29c7d8046c2d96630d79 WatchSource:0}: Error finding container 9ae1ee41d37f4b0f2aff315d3bc5733756252272483e29c7d8046c2d96630d79: Status 404 returned error can't find the container with id 9ae1ee41d37f4b0f2aff315d3bc5733756252272483e29c7d8046c2d96630d79 Dec 03 19:55:37.184857 master-0 kubenswrapper[9368]: I1203 19:55:37.184807 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 19:55:37.185028 master-0 kubenswrapper[9368]: I1203 19:55:37.184867 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" Dec 03 19:55:37.185028 master-0 kubenswrapper[9368]: I1203 19:55:37.184904 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" Dec 03 19:55:37.185028 master-0 kubenswrapper[9368]: I1203 19:55:37.184939 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 19:55:37.185028 master-0 kubenswrapper[9368]: I1203 19:55:37.184982 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" Dec 03 19:55:37.185028 master-0 kubenswrapper[9368]: I1203 19:55:37.185002 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:37.185028 master-0 kubenswrapper[9368]: E1203 19:55:37.185003 9368 secret.go:189] Couldn't get 
secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 03 19:55:37.185271 master-0 kubenswrapper[9368]: E1203 19:55:37.185079 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert podName:0d4e4f88-7106-4a46-8b63-053345922fb0 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:39.185058847 +0000 UTC m=+4.846308778 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-pqz7q" (UID: "0d4e4f88-7106-4a46-8b63-053345922fb0") : secret "package-server-manager-serving-cert" not found Dec 03 19:55:37.185271 master-0 kubenswrapper[9368]: E1203 19:55:37.185123 9368 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 03 19:55:37.185271 master-0 kubenswrapper[9368]: E1203 19:55:37.185129 9368 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Dec 03 19:55:37.185271 master-0 kubenswrapper[9368]: E1203 19:55:37.185172 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics podName:b673cb04-f6f0-4113-bdcd-d6685b942c9f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:39.185157949 +0000 UTC m=+4.846407860 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-xqvv6" (UID: "b673cb04-f6f0-4113-bdcd-d6685b942c9f") : secret "marketplace-operator-metrics" not found Dec 03 19:55:37.185271 master-0 kubenswrapper[9368]: E1203 19:55:37.185215 9368 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 03 19:55:37.185271 master-0 kubenswrapper[9368]: E1203 19:55:37.185237 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls podName:128ed384-7ab6-41b6-bf45-c8fda917d52f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:39.185229041 +0000 UTC m=+4.846478952 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls") pod "dns-operator-6b7bcd6566-4wcq2" (UID: "128ed384-7ab6-41b6-bf45-c8fda917d52f") : secret "metrics-tls" not found Dec 03 19:55:37.185271 master-0 kubenswrapper[9368]: E1203 19:55:37.185253 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert podName:a19b8f9e-6299-43bf-9aa5-22071b855773 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:39.185245391 +0000 UTC m=+4.846495302 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert") pod "olm-operator-76bd5d69c7-wg7fw" (UID: "a19b8f9e-6299-43bf-9aa5-22071b855773") : secret "olm-operator-serving-cert" not found Dec 03 19:55:37.185579 master-0 kubenswrapper[9368]: E1203 19:55:37.185290 9368 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 03 19:55:37.185579 master-0 kubenswrapper[9368]: E1203 19:55:37.185296 9368 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 03 19:55:37.185579 master-0 kubenswrapper[9368]: E1203 19:55:37.185310 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls podName:ba68608f-6b36-455e-b80b-d19237df9312 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:39.185304303 +0000 UTC m=+4.846554214 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-dhgcv" (UID: "ba68608f-6b36-455e-b80b-d19237df9312") : secret "cluster-monitoring-operator-tls" not found Dec 03 19:55:37.185579 master-0 kubenswrapper[9368]: E1203 19:55:37.185358 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls podName:5decce88-c71e-411c-87b5-a37dd0f77e7b nodeName:}" failed. No retries permitted until 2025-12-03 19:55:39.185327643 +0000 UTC m=+4.846577764 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-59j4p" (UID: "5decce88-c71e-411c-87b5-a37dd0f77e7b") : secret "image-registry-operator-tls" not found Dec 03 19:55:37.222155 master-0 kubenswrapper[9368]: I1203 19:55:37.221999 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-6b8bb995f7-bj4vz"] Dec 03 19:55:37.222155 master-0 kubenswrapper[9368]: I1203 19:55:37.222102 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:55:37.227119 master-0 kubenswrapper[9368]: I1203 19:55:37.226946 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:55:37.288685 master-0 kubenswrapper[9368]: I1203 19:55:37.288005 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 19:55:37.288685 master-0 kubenswrapper[9368]: I1203 19:55:37.288043 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:55:37.288685 master-0 kubenswrapper[9368]: I1203 19:55:37.288078 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" Dec 03 19:55:37.288685 master-0 kubenswrapper[9368]: I1203 19:55:37.288143 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" Dec 03 19:55:37.288685 master-0 kubenswrapper[9368]: I1203 19:55:37.288173 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 19:55:37.288685 master-0 kubenswrapper[9368]: I1203 19:55:37.288210 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:55:37.288685 master-0 kubenswrapper[9368]: I1203 19:55:37.288233 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 19:55:37.288685 master-0 kubenswrapper[9368]: I1203 19:55:37.288272 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" Dec 03 19:55:37.288685 master-0 kubenswrapper[9368]: I1203 19:55:37.288313 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 19:55:37.288685 master-0 kubenswrapper[9368]: E1203 19:55:37.288465 9368 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 03 19:55:37.288685 master-0 kubenswrapper[9368]: E1203 19:55:37.288524 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls podName:7ed25861-1328-45e7-922e-37588a0b019c nodeName:}" failed. No retries permitted until 2025-12-03 19:55:39.288507525 +0000 UTC m=+4.949757436 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bbd9b9dff-vqzdb" (UID: "7ed25861-1328-45e7-922e-37588a0b019c") : secret "node-tuning-operator-tls" not found Dec 03 19:55:37.289307 master-0 kubenswrapper[9368]: E1203 19:55:37.288898 9368 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 03 19:55:37.289307 master-0 kubenswrapper[9368]: E1203 19:55:37.288951 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls podName:3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf nodeName:}" failed. No retries permitted until 2025-12-03 19:55:39.288939195 +0000 UTC m=+4.950189106 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls") pod "ingress-operator-85dbd94574-l7bzj" (UID: "3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf") : secret "metrics-tls" not found Dec 03 19:55:37.289307 master-0 kubenswrapper[9368]: E1203 19:55:37.288993 9368 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 03 19:55:37.289307 master-0 kubenswrapper[9368]: E1203 19:55:37.289013 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs podName:46b5d4d0-b841-4e87-84b4-85911ff04325 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:39.289006847 +0000 UTC m=+4.950256758 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs") pod "network-metrics-daemon-hs6gf" (UID: "46b5d4d0-b841-4e87-84b4-85911ff04325") : secret "metrics-daemon-secret" not found Dec 03 19:55:37.289307 master-0 kubenswrapper[9368]: E1203 19:55:37.289047 9368 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 03 19:55:37.289307 master-0 kubenswrapper[9368]: E1203 19:55:37.289068 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert podName:82718569-4870-4f94-b2e7-7ccd7d4de8ff nodeName:}" failed. No retries permitted until 2025-12-03 19:55:39.289059078 +0000 UTC m=+4.950308989 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert") pod "route-controller-manager-54bbbcd887-h4khj" (UID: "82718569-4870-4f94-b2e7-7ccd7d4de8ff") : secret "serving-cert" not found Dec 03 19:55:37.289307 master-0 kubenswrapper[9368]: E1203 19:55:37.289115 9368 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 03 19:55:37.289307 master-0 kubenswrapper[9368]: E1203 19:55:37.289136 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca podName:82718569-4870-4f94-b2e7-7ccd7d4de8ff nodeName:}" failed. No retries permitted until 2025-12-03 19:55:39.28913095 +0000 UTC m=+4.950380861 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca") pod "route-controller-manager-54bbbcd887-h4khj" (UID: "82718569-4870-4f94-b2e7-7ccd7d4de8ff") : configmap "client-ca" not found Dec 03 19:55:37.289307 master-0 kubenswrapper[9368]: E1203 19:55:37.289168 9368 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 03 19:55:37.289307 master-0 kubenswrapper[9368]: E1203 19:55:37.289204 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert podName:7ed25861-1328-45e7-922e-37588a0b019c nodeName:}" failed. No retries permitted until 2025-12-03 19:55:39.289195892 +0000 UTC m=+4.950445803 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert") pod "cluster-node-tuning-operator-bbd9b9dff-vqzdb" (UID: "7ed25861-1328-45e7-922e-37588a0b019c") : secret "performance-addon-operator-webhook-cert" not found Dec 03 19:55:37.289307 master-0 kubenswrapper[9368]: E1203 19:55:37.289240 9368 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 19:55:37.289307 master-0 kubenswrapper[9368]: E1203 19:55:37.289263 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert podName:0c45d22f-1492-47d7-83b6-6dd356a8454d nodeName:}" failed. No retries permitted until 2025-12-03 19:55:39.289255603 +0000 UTC m=+4.950505514 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert") pod "cluster-version-operator-869c786959-zbl42" (UID: "0c45d22f-1492-47d7-83b6-6dd356a8454d") : secret "cluster-version-operator-serving-cert" not found Dec 03 19:55:37.289307 master-0 kubenswrapper[9368]: E1203 19:55:37.289303 9368 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Dec 03 19:55:37.290719 master-0 kubenswrapper[9368]: E1203 19:55:37.289325 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert podName:d5f33153-bff1-403f-ae17-b7e90500365d nodeName:}" failed. No retries permitted until 2025-12-03 19:55:39.289318645 +0000 UTC m=+4.950568556 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert") pod "catalog-operator-7cf5cf757f-25z8n" (UID: "d5f33153-bff1-403f-ae17-b7e90500365d") : secret "catalog-operator-serving-cert" not found Dec 03 19:55:37.290719 master-0 kubenswrapper[9368]: E1203 19:55:37.289362 9368 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 03 19:55:37.290719 master-0 kubenswrapper[9368]: E1203 19:55:37.289387 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs podName:b4316c8d-a1d3-4e51-83cc-d0eecb809924 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:39.289379226 +0000 UTC m=+4.950629147 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs") pod "multus-admission-controller-78ddcf56f9-nqn2j" (UID: "b4316c8d-a1d3-4e51-83cc-d0eecb809924") : secret "multus-admission-controller-secret" not found Dec 03 19:55:37.326637 master-0 kubenswrapper[9368]: I1203 19:55:37.325799 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:37.409997 master-0 kubenswrapper[9368]: I1203 19:55:37.409509 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:55:37.647642 master-0 kubenswrapper[9368]: I1203 19:55:37.647590 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" event={"ID":"63e3d36d-1676-4f90-ac9a-d85b861a4655","Type":"ContainerStarted","Data":"59561622c420df151d8043e444eaec7dca0c22e244b1a6ac8880f20fe809e5c4"} Dec 03 19:55:37.647642 master-0 kubenswrapper[9368]: I1203 19:55:37.647630 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" event={"ID":"63e3d36d-1676-4f90-ac9a-d85b861a4655","Type":"ContainerStarted","Data":"0d31ad42fdaaa8d9f4506f72df0676530f77957571a46716dc1e834dfef43d2c"} Dec 03 19:55:37.649034 master-0 kubenswrapper[9368]: I1203 19:55:37.648998 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" event={"ID":"367c2c7c-1fc8-4608-aa94-b64c6c70cc61","Type":"ContainerStarted","Data":"988c74f72d6d3987e23eadc15e10a46097f9412b88f2d407e398a913b05fa016"} Dec 03 19:55:37.649895 master-0 kubenswrapper[9368]: I1203 19:55:37.649849 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-h2w9j" 
event={"ID":"2d43df9b-bb29-4581-8cd9-f3b9c0c0e4d9","Type":"ContainerStarted","Data":"9ae1ee41d37f4b0f2aff315d3bc5733756252272483e29c7d8046c2d96630d79"} Dec 03 19:55:37.997907 master-0 kubenswrapper[9368]: I1203 19:55:37.996467 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56fb5cd58b-cqggq"] Dec 03 19:55:38.005699 master-0 kubenswrapper[9368]: I1203 19:55:38.005551 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-56fb5cd58b-cqggq"] Dec 03 19:55:38.607815 master-0 kubenswrapper[9368]: I1203 19:55:38.607449 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:55:38.607815 master-0 kubenswrapper[9368]: I1203 19:55:38.607621 9368 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 19:55:38.652728 master-0 kubenswrapper[9368]: I1203 19:55:38.652693 9368 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 19:55:38.652728 master-0 kubenswrapper[9368]: I1203 19:55:38.652723 9368 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 19:55:38.653295 master-0 kubenswrapper[9368]: I1203 19:55:38.652750 9368 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 19:55:38.676022 master-0 kubenswrapper[9368]: I1203 19:55:38.675977 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:55:38.681118 master-0 kubenswrapper[9368]: I1203 19:55:38.681089 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 19:55:38.886920 master-0 kubenswrapper[9368]: I1203 19:55:38.886225 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-768d5b868-82c4q"] 
Dec 03 19:55:38.886920 master-0 kubenswrapper[9368]: E1203 19:55:38.886387 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74ccc53d-803e-4d7d-a9b0-6cd604e7907a" containerName="prober" Dec 03 19:55:38.886920 master-0 kubenswrapper[9368]: I1203 19:55:38.886400 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="74ccc53d-803e-4d7d-a9b0-6cd604e7907a" containerName="prober" Dec 03 19:55:38.886920 master-0 kubenswrapper[9368]: E1203 19:55:38.886410 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b6e1832-278b-4e37-b92b-2584e2daa34c" containerName="assisted-installer-controller" Dec 03 19:55:38.886920 master-0 kubenswrapper[9368]: I1203 19:55:38.886418 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b6e1832-278b-4e37-b92b-2584e2daa34c" containerName="assisted-installer-controller" Dec 03 19:55:38.886920 master-0 kubenswrapper[9368]: I1203 19:55:38.886507 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="74ccc53d-803e-4d7d-a9b0-6cd604e7907a" containerName="prober" Dec 03 19:55:38.886920 master-0 kubenswrapper[9368]: I1203 19:55:38.886524 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b6e1832-278b-4e37-b92b-2584e2daa34c" containerName="assisted-installer-controller" Dec 03 19:55:38.886920 master-0 kubenswrapper[9368]: I1203 19:55:38.886838 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:38.889420 master-0 kubenswrapper[9368]: I1203 19:55:38.889369 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 03 19:55:38.889497 master-0 kubenswrapper[9368]: I1203 19:55:38.889451 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 03 19:55:38.889497 master-0 kubenswrapper[9368]: I1203 19:55:38.889474 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 03 19:55:38.889683 master-0 kubenswrapper[9368]: I1203 19:55:38.889599 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 03 19:55:38.889683 master-0 kubenswrapper[9368]: I1203 19:55:38.889625 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 03 19:55:38.905001 master-0 kubenswrapper[9368]: I1203 19:55:38.899711 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-768d5b868-82c4q"]
Dec 03 19:55:38.914002 master-0 kubenswrapper[9368]: I1203 19:55:38.913770 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 03 19:55:39.009243 master-0 kubenswrapper[9368]: I1203 19:55:39.009179 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-proxy-ca-bundles\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:39.009243 master-0 kubenswrapper[9368]: I1203 19:55:39.009236 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-client-ca\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:39.009471 master-0 kubenswrapper[9368]: I1203 19:55:39.009271 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-config\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:39.009471 master-0 kubenswrapper[9368]: I1203 19:55:39.009336 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-serving-cert\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:39.009471 master-0 kubenswrapper[9368]: I1203 19:55:39.009391 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pczf\" (UniqueName: \"kubernetes.io/projected/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-kube-api-access-2pczf\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:39.109970 master-0 kubenswrapper[9368]: I1203 19:55:39.109923 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-config\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:39.110160 master-0 kubenswrapper[9368]: I1203 19:55:39.110023 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-serving-cert\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:39.110160 master-0 kubenswrapper[9368]: I1203 19:55:39.110112 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pczf\" (UniqueName: \"kubernetes.io/projected/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-kube-api-access-2pczf\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:39.110223 master-0 kubenswrapper[9368]: I1203 19:55:39.110200 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-proxy-ca-bundles\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:39.110270 master-0 kubenswrapper[9368]: I1203 19:55:39.110243 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-client-ca\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:39.110411 master-0 kubenswrapper[9368]: E1203 19:55:39.110379 9368 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Dec 03 19:55:39.110489 master-0 kubenswrapper[9368]: E1203 19:55:39.110464 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-client-ca podName:5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:39.610432421 +0000 UTC m=+5.271682342 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-client-ca") pod "controller-manager-768d5b868-82c4q" (UID: "5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5") : configmap "client-ca" not found
Dec 03 19:55:39.110920 master-0 kubenswrapper[9368]: E1203 19:55:39.110898 9368 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Dec 03 19:55:39.110982 master-0 kubenswrapper[9368]: E1203 19:55:39.110950 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-serving-cert podName:5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:39.610935914 +0000 UTC m=+5.272185845 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-serving-cert") pod "controller-manager-768d5b868-82c4q" (UID: "5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5") : secret "serving-cert" not found
Dec 03 19:55:39.111432 master-0 kubenswrapper[9368]: I1203 19:55:39.111390 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-config\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:39.112739 master-0 kubenswrapper[9368]: I1203 19:55:39.112704 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-proxy-ca-bundles\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:39.144859 master-0 kubenswrapper[9368]: I1203 19:55:39.137669 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pczf\" (UniqueName: \"kubernetes.io/projected/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-kube-api-access-2pczf\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:39.211757 master-0 kubenswrapper[9368]: I1203 19:55:39.211701 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q"
Dec 03 19:55:39.211757 master-0 kubenswrapper[9368]: I1203 19:55:39.211755 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 19:55:39.212168 master-0 kubenswrapper[9368]: I1203 19:55:39.211810 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2"
Dec 03 19:55:39.212168 master-0 kubenswrapper[9368]: I1203 19:55:39.211830 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p"
Dec 03 19:55:39.212168 master-0 kubenswrapper[9368]: I1203 19:55:39.211878 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"
Dec 03 19:55:39.212168 master-0 kubenswrapper[9368]: I1203 19:55:39.211903 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6"
Dec 03 19:55:39.212168 master-0 kubenswrapper[9368]: E1203 19:55:39.211934 9368 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Dec 03 19:55:39.212168 master-0 kubenswrapper[9368]: E1203 19:55:39.212003 9368 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Dec 03 19:55:39.212168 master-0 kubenswrapper[9368]: E1203 19:55:39.212014 9368 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Dec 03 19:55:39.212168 master-0 kubenswrapper[9368]: E1203 19:55:39.212014 9368 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Dec 03 19:55:39.212168 master-0 kubenswrapper[9368]: E1203 19:55:39.212040 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert podName:0d4e4f88-7106-4a46-8b63-053345922fb0 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:43.212012701 +0000 UTC m=+8.873262632 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-pqz7q" (UID: "0d4e4f88-7106-4a46-8b63-053345922fb0") : secret "package-server-manager-serving-cert" not found
Dec 03 19:55:39.212168 master-0 kubenswrapper[9368]: E1203 19:55:39.212003 9368 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Dec 03 19:55:39.212544 master-0 kubenswrapper[9368]: E1203 19:55:39.212183 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls podName:5decce88-c71e-411c-87b5-a37dd0f77e7b nodeName:}" failed. No retries permitted until 2025-12-03 19:55:43.212153504 +0000 UTC m=+8.873403445 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-59j4p" (UID: "5decce88-c71e-411c-87b5-a37dd0f77e7b") : secret "image-registry-operator-tls" not found
Dec 03 19:55:39.212544 master-0 kubenswrapper[9368]: E1203 19:55:39.212206 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls podName:128ed384-7ab6-41b6-bf45-c8fda917d52f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:43.212194535 +0000 UTC m=+8.873444556 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls") pod "dns-operator-6b7bcd6566-4wcq2" (UID: "128ed384-7ab6-41b6-bf45-c8fda917d52f") : secret "metrics-tls" not found
Dec 03 19:55:39.212544 master-0 kubenswrapper[9368]: E1203 19:55:39.212133 9368 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Dec 03 19:55:39.212544 master-0 kubenswrapper[9368]: E1203 19:55:39.212224 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert podName:a19b8f9e-6299-43bf-9aa5-22071b855773 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:43.212217096 +0000 UTC m=+8.873467147 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert") pod "olm-operator-76bd5d69c7-wg7fw" (UID: "a19b8f9e-6299-43bf-9aa5-22071b855773") : secret "olm-operator-serving-cert" not found
Dec 03 19:55:39.212544 master-0 kubenswrapper[9368]: E1203 19:55:39.212247 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics podName:b673cb04-f6f0-4113-bdcd-d6685b942c9f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:43.212236646 +0000 UTC m=+8.873486567 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-xqvv6" (UID: "b673cb04-f6f0-4113-bdcd-d6685b942c9f") : secret "marketplace-operator-metrics" not found
Dec 03 19:55:39.212544 master-0 kubenswrapper[9368]: E1203 19:55:39.212268 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls podName:ba68608f-6b36-455e-b80b-d19237df9312 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:43.212258237 +0000 UTC m=+8.873508278 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-dhgcv" (UID: "ba68608f-6b36-455e-b80b-d19237df9312") : secret "cluster-monitoring-operator-tls" not found
Dec 03 19:55:39.312603 master-0 kubenswrapper[9368]: I1203 19:55:39.312521 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj"
Dec 03 19:55:39.312603 master-0 kubenswrapper[9368]: I1203 19:55:39.312565 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf"
Dec 03 19:55:39.312603 master-0 kubenswrapper[9368]: I1203 19:55:39.312585 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj"
Dec 03 19:55:39.313002 master-0 kubenswrapper[9368]: I1203 19:55:39.312633 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj"
Dec 03 19:55:39.313002 master-0 kubenswrapper[9368]: I1203 19:55:39.312656 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:39.313002 master-0 kubenswrapper[9368]: I1203 19:55:39.312682 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42"
Dec 03 19:55:39.313002 master-0 kubenswrapper[9368]: I1203 19:55:39.312697 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n"
Dec 03 19:55:39.313002 master-0 kubenswrapper[9368]: I1203 19:55:39.312725 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j"
Dec 03 19:55:39.313002 master-0 kubenswrapper[9368]: E1203 19:55:39.312741 9368 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Dec 03 19:55:39.313002 master-0 kubenswrapper[9368]: E1203 19:55:39.312846 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls podName:3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf nodeName:}" failed. No retries permitted until 2025-12-03 19:55:43.312825331 +0000 UTC m=+8.974075242 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls") pod "ingress-operator-85dbd94574-l7bzj" (UID: "3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf") : secret "metrics-tls" not found
Dec 03 19:55:39.313421 master-0 kubenswrapper[9368]: I1203 19:55:39.312752 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:39.313647 master-0 kubenswrapper[9368]: E1203 19:55:39.313605 9368 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Dec 03 19:55:39.313721 master-0 kubenswrapper[9368]: E1203 19:55:39.313648 9368 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Dec 03 19:55:39.313721 master-0 kubenswrapper[9368]: E1203 19:55:39.313653 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert podName:82718569-4870-4f94-b2e7-7ccd7d4de8ff nodeName:}" failed. No retries permitted until 2025-12-03 19:55:43.313641873 +0000 UTC m=+8.974891894 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert") pod "route-controller-manager-54bbbcd887-h4khj" (UID: "82718569-4870-4f94-b2e7-7ccd7d4de8ff") : secret "serving-cert" not found
Dec 03 19:55:39.313721 master-0 kubenswrapper[9368]: E1203 19:55:39.313700 9368 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Dec 03 19:55:39.313721 master-0 kubenswrapper[9368]: E1203 19:55:39.313711 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca podName:82718569-4870-4f94-b2e7-7ccd7d4de8ff nodeName:}" failed. No retries permitted until 2025-12-03 19:55:43.313694814 +0000 UTC m=+8.974944725 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca") pod "route-controller-manager-54bbbcd887-h4khj" (UID: "82718569-4870-4f94-b2e7-7ccd7d4de8ff") : configmap "client-ca" not found
Dec 03 19:55:39.313721 master-0 kubenswrapper[9368]: E1203 19:55:39.313614 9368 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Dec 03 19:55:39.313721 master-0 kubenswrapper[9368]: E1203 19:55:39.313723 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs podName:46b5d4d0-b841-4e87-84b4-85911ff04325 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:43.313717214 +0000 UTC m=+8.974967125 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs") pod "network-metrics-daemon-hs6gf" (UID: "46b5d4d0-b841-4e87-84b4-85911ff04325") : secret "metrics-daemon-secret" not found
Dec 03 19:55:39.314110 master-0 kubenswrapper[9368]: E1203 19:55:39.313735 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs podName:b4316c8d-a1d3-4e51-83cc-d0eecb809924 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:43.313729855 +0000 UTC m=+8.974979766 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs") pod "multus-admission-controller-78ddcf56f9-nqn2j" (UID: "b4316c8d-a1d3-4e51-83cc-d0eecb809924") : secret "multus-admission-controller-secret" not found
Dec 03 19:55:39.314110 master-0 kubenswrapper[9368]: E1203 19:55:39.313765 9368 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Dec 03 19:55:39.314110 master-0 kubenswrapper[9368]: E1203 19:55:39.313804 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert podName:0c45d22f-1492-47d7-83b6-6dd356a8454d nodeName:}" failed. No retries permitted until 2025-12-03 19:55:43.313797966 +0000 UTC m=+8.975047867 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert") pod "cluster-version-operator-869c786959-zbl42" (UID: "0c45d22f-1492-47d7-83b6-6dd356a8454d") : secret "cluster-version-operator-serving-cert" not found
Dec 03 19:55:39.314110 master-0 kubenswrapper[9368]: E1203 19:55:39.313816 9368 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Dec 03 19:55:39.314110 master-0 kubenswrapper[9368]: E1203 19:55:39.313911 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert podName:d5f33153-bff1-403f-ae17-b7e90500365d nodeName:}" failed. No retries permitted until 2025-12-03 19:55:43.313882219 +0000 UTC m=+8.975132170 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert") pod "catalog-operator-7cf5cf757f-25z8n" (UID: "d5f33153-bff1-403f-ae17-b7e90500365d") : secret "catalog-operator-serving-cert" not found
Dec 03 19:55:39.331991 master-0 kubenswrapper[9368]: I1203 19:55:39.317549 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:39.331991 master-0 kubenswrapper[9368]: I1203 19:55:39.317886 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:39.412296 master-0 kubenswrapper[9368]: I1203 19:55:39.412192 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"
Dec 03 19:55:39.617311 master-0 kubenswrapper[9368]: I1203 19:55:39.617188 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-client-ca\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:39.617607 master-0 kubenswrapper[9368]: I1203 19:55:39.617364 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-serving-cert\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:39.617607 master-0 kubenswrapper[9368]: E1203 19:55:39.617404 9368 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Dec 03 19:55:39.617607 master-0 kubenswrapper[9368]: E1203 19:55:39.617507 9368 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Dec 03 19:55:39.617607 master-0 kubenswrapper[9368]: E1203 19:55:39.617564 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-client-ca podName:5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:40.617536331 +0000 UTC m=+6.278786282 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-client-ca") pod "controller-manager-768d5b868-82c4q" (UID: "5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5") : configmap "client-ca" not found
Dec 03 19:55:39.617607 master-0 kubenswrapper[9368]: E1203 19:55:39.617592 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-serving-cert podName:5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:40.617580072 +0000 UTC m=+6.278830013 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-serving-cert") pod "controller-manager-768d5b868-82c4q" (UID: "5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5") : secret "serving-cert" not found
Dec 03 19:55:39.657991 master-0 kubenswrapper[9368]: I1203 19:55:39.657936 9368 generic.go:334] "Generic (PLEG): container finished" podID="78a864f2-934f-4197-9753-24c9bc7f1fca" containerID="0a8f6a401bc81d9be5a9cf7156ef428d64e4a5a0d08e4c992efc6ddc65d0a9c3" exitCode=0
Dec 03 19:55:39.658675 master-0 kubenswrapper[9368]: I1203 19:55:39.658019 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" event={"ID":"78a864f2-934f-4197-9753-24c9bc7f1fca","Type":"ContainerDied","Data":"0a8f6a401bc81d9be5a9cf7156ef428d64e4a5a0d08e4c992efc6ddc65d0a9c3"}
Dec 03 19:55:39.658825 master-0 kubenswrapper[9368]: I1203 19:55:39.658741 9368 scope.go:117] "RemoveContainer" containerID="0a8f6a401bc81d9be5a9cf7156ef428d64e4a5a0d08e4c992efc6ddc65d0a9c3"
Dec 03 19:55:40.642983 master-0 kubenswrapper[9368]: I1203 19:55:40.642881 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-client-ca\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:40.642983 master-0 kubenswrapper[9368]: I1203 19:55:40.642980 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-serving-cert\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:40.643266 master-0 kubenswrapper[9368]: E1203 19:55:40.643029 9368 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Dec 03 19:55:40.643266 master-0 kubenswrapper[9368]: E1203 19:55:40.643107 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-client-ca podName:5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:42.64308564 +0000 UTC m=+8.304335561 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-client-ca") pod "controller-manager-768d5b868-82c4q" (UID: "5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5") : configmap "client-ca" not found
Dec 03 19:55:40.646764 master-0 kubenswrapper[9368]: I1203 19:55:40.646709 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-serving-cert\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:40.888976 master-0 kubenswrapper[9368]: I1203 19:55:40.888135 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb"]
Dec 03 19:55:40.949923 master-0 kubenswrapper[9368]: I1203 19:55:40.949870 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:40.950083 master-0 kubenswrapper[9368]: I1203 19:55:40.950066 9368 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 19:55:40.950120 master-0 kubenswrapper[9368]: I1203 19:55:40.950086 9368 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 19:55:40.979883 master-0 kubenswrapper[9368]: I1203 19:55:40.979823 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 19:55:41.665744 master-0 kubenswrapper[9368]: I1203 19:55:41.665690 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" event={"ID":"f9f99422-7991-40ef-92a1-de2e603e47b9","Type":"ContainerStarted","Data":"52062cf7e28f06e4b78d834f54e665243402b015a9d5ef15880a1512af2a4c43"}
Dec 03 19:55:41.667500 master-0 kubenswrapper[9368]: I1203 19:55:41.667455 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" event={"ID":"78a864f2-934f-4197-9753-24c9bc7f1fca","Type":"ContainerStarted","Data":"86fb2ded70064a9e30cf3bd596a82e68f52a88cf948050917e5c6fb69423eb23"}
Dec 03 19:55:41.668527 master-0 kubenswrapper[9368]: I1203 19:55:41.668487 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" event={"ID":"7ed25861-1328-45e7-922e-37588a0b019c","Type":"ContainerStarted","Data":"2ae6841c89bd0bc9cfc6015de7cc1e3a4bbed5c62b59fd91032790f9ed1aaac0"}
Dec 03 19:55:41.670007 master-0 kubenswrapper[9368]: I1203 19:55:41.669978 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" event={"ID":"367c2c7c-1fc8-4608-aa94-b64c6c70cc61","Type":"ContainerStarted","Data":"0e21d1a78f01b2c86e8a517177c7568f6695fa81ca18975759c979beb59d6b4b"}
Dec 03 19:55:41.672022 master-0 kubenswrapper[9368]: I1203 19:55:41.671986 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-h2w9j" event={"ID":"2d43df9b-bb29-4581-8cd9-f3b9c0c0e4d9","Type":"ContainerStarted","Data":"40dc2c72189e3bef675966365b4c51f759bc53fe05a0248a9eafcd01222f9890"}
Dec 03 19:55:41.672022 master-0 kubenswrapper[9368]: I1203 19:55:41.672013 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-h2w9j" event={"ID":"2d43df9b-bb29-4581-8cd9-f3b9c0c0e4d9","Type":"ContainerStarted","Data":"01e90aa91324a3b7e384c69f22c6b74b9146251b4abacc9da0b7979f70f1db5a"}
Dec 03 19:55:41.673975 master-0 kubenswrapper[9368]: I1203 19:55:41.673937 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" event={"ID":"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9","Type":"ContainerStarted","Data":"d3dcff6d3aa1b038077193f459470aa3ca6e3833e6b52e5e7c49c67633f191e1"}
Dec 03 19:55:41.674046 master-0 kubenswrapper[9368]: I1203 19:55:41.673900 9368 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 19:55:41.779893 master-0 kubenswrapper[9368]: I1203 19:55:41.777507 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-h2w9j" podStartSLOduration=4.161205441 podStartE2EDuration="7.777489415s" podCreationTimestamp="2025-12-03 19:55:34 +0000 UTC" firstStartedPulling="2025-12-03 19:55:37.045610744 +0000 UTC m=+2.706860655" lastFinishedPulling="2025-12-03 19:55:40.661894708 +0000 UTC m=+6.323144629" observedRunningTime="2025-12-03 19:55:41.751367441 +0000 UTC m=+7.412617382" watchObservedRunningTime="2025-12-03 19:55:41.777489415 +0000 UTC m=+7.438739326"
Dec 03 19:55:41.918868 master-0 kubenswrapper[9368]: I1203 19:55:41.918236 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:55:41.918868 master-0 kubenswrapper[9368]: I1203 19:55:41.918463 9368 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 19:55:41.928874 master-0 kubenswrapper[9368]: I1203 19:55:41.925278 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 19:55:42.667130 master-0 kubenswrapper[9368]: I1203 19:55:42.666662 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-client-ca\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q"
Dec 03 19:55:42.667534 master-0 kubenswrapper[9368]: E1203 19:55:42.666897 9368 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Dec 03 19:55:42.667534 master-0 kubenswrapper[9368]: E1203 19:55:42.667512 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-client-ca podName:5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:46.66749181 +0000 UTC m=+12.328741721 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-client-ca") pod "controller-manager-768d5b868-82c4q" (UID: "5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5") : configmap "client-ca" not found
Dec 03 19:55:42.679177 master-0 kubenswrapper[9368]: I1203 19:55:42.679148 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-8xmrv_0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/openshift-config-operator/0.log"
Dec 03 19:55:42.679589 master-0 kubenswrapper[9368]: I1203 19:55:42.679558 9368 generic.go:334] "Generic (PLEG): container finished" podID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerID="d3dcff6d3aa1b038077193f459470aa3ca6e3833e6b52e5e7c49c67633f191e1" exitCode=255
Dec 03 19:55:42.679660 master-0 kubenswrapper[9368]: I1203 19:55:42.679599 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" event={"ID":"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9","Type":"ContainerDied","Data":"d3dcff6d3aa1b038077193f459470aa3ca6e3833e6b52e5e7c49c67633f191e1"}
Dec 03 19:55:42.679844 master-0 kubenswrapper[9368]: I1203 19:55:42.679821 9368 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 19:55:42.680312 master-0 kubenswrapper[9368]: I1203 19:55:42.680290 9368 scope.go:117] "RemoveContainer"
containerID="d3dcff6d3aa1b038077193f459470aa3ca6e3833e6b52e5e7c49c67633f191e1" Dec 03 19:55:42.684963 master-0 kubenswrapper[9368]: I1203 19:55:42.684922 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:55:42.844914 master-0 kubenswrapper[9368]: I1203 19:55:42.844858 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:55:42.849358 master-0 kubenswrapper[9368]: I1203 19:55:42.849334 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 19:55:43.276247 master-0 kubenswrapper[9368]: I1203 19:55:43.276182 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 19:55:43.276247 master-0 kubenswrapper[9368]: I1203 19:55:43.276251 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" Dec 03 19:55:43.276773 master-0 kubenswrapper[9368]: I1203 19:55:43.276290 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " 
pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" Dec 03 19:55:43.276773 master-0 kubenswrapper[9368]: I1203 19:55:43.276311 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 19:55:43.276773 master-0 kubenswrapper[9368]: E1203 19:55:43.276345 9368 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 03 19:55:43.276773 master-0 kubenswrapper[9368]: E1203 19:55:43.276428 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert podName:0d4e4f88-7106-4a46-8b63-053345922fb0 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:51.276407316 +0000 UTC m=+16.937657237 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-pqz7q" (UID: "0d4e4f88-7106-4a46-8b63-053345922fb0") : secret "package-server-manager-serving-cert" not found Dec 03 19:55:43.276773 master-0 kubenswrapper[9368]: E1203 19:55:43.276452 9368 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 03 19:55:43.276773 master-0 kubenswrapper[9368]: E1203 19:55:43.276530 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls podName:ba68608f-6b36-455e-b80b-d19237df9312 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:51.276516399 +0000 UTC m=+16.937766310 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-dhgcv" (UID: "ba68608f-6b36-455e-b80b-d19237df9312") : secret "cluster-monitoring-operator-tls" not found Dec 03 19:55:43.276773 master-0 kubenswrapper[9368]: E1203 19:55:43.276565 9368 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Dec 03 19:55:43.276773 master-0 kubenswrapper[9368]: E1203 19:55:43.276584 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert podName:a19b8f9e-6299-43bf-9aa5-22071b855773 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:51.276578111 +0000 UTC m=+16.937828022 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert") pod "olm-operator-76bd5d69c7-wg7fw" (UID: "a19b8f9e-6299-43bf-9aa5-22071b855773") : secret "olm-operator-serving-cert" not found Dec 03 19:55:43.276773 master-0 kubenswrapper[9368]: I1203 19:55:43.276359 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" Dec 03 19:55:43.277057 master-0 kubenswrapper[9368]: I1203 19:55:43.276869 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:43.277057 master-0 kubenswrapper[9368]: E1203 19:55:43.277022 9368 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 03 19:55:43.277057 master-0 kubenswrapper[9368]: E1203 19:55:43.277056 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics podName:b673cb04-f6f0-4113-bdcd-d6685b942c9f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:51.277046073 +0000 UTC m=+16.938295994 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-xqvv6" (UID: "b673cb04-f6f0-4113-bdcd-d6685b942c9f") : secret "marketplace-operator-metrics" not found Dec 03 19:55:43.282553 master-0 kubenswrapper[9368]: I1203 19:55:43.282513 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 19:55:43.282553 master-0 kubenswrapper[9368]: I1203 19:55:43.282519 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" Dec 03 19:55:43.311037 master-0 kubenswrapper[9368]: I1203 19:55:43.310969 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 19:55:43.311572 master-0 kubenswrapper[9368]: I1203 19:55:43.311541 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" Dec 03 19:55:43.384260 master-0 kubenswrapper[9368]: I1203 19:55:43.382989 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:55:43.384260 master-0 kubenswrapper[9368]: I1203 19:55:43.383051 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 19:55:43.384260 master-0 kubenswrapper[9368]: I1203 19:55:43.383090 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" Dec 03 19:55:43.384260 master-0 kubenswrapper[9368]: I1203 19:55:43.383139 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 19:55:43.384260 master-0 kubenswrapper[9368]: I1203 19:55:43.383162 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:55:43.384260 master-0 kubenswrapper[9368]: I1203 19:55:43.383187 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" Dec 03 19:55:43.384260 master-0 kubenswrapper[9368]: I1203 19:55:43.383245 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" Dec 03 19:55:43.384260 master-0 kubenswrapper[9368]: E1203 19:55:43.383359 9368 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 03 19:55:43.384260 master-0 kubenswrapper[9368]: E1203 19:55:43.383371 9368 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 03 19:55:43.384260 master-0 kubenswrapper[9368]: E1203 19:55:43.383417 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca podName:82718569-4870-4f94-b2e7-7ccd7d4de8ff nodeName:}" failed. No retries permitted until 2025-12-03 19:55:51.383398985 +0000 UTC m=+17.044648896 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca") pod "route-controller-manager-54bbbcd887-h4khj" (UID: "82718569-4870-4f94-b2e7-7ccd7d4de8ff") : configmap "client-ca" not found Dec 03 19:55:43.384260 master-0 kubenswrapper[9368]: E1203 19:55:43.383434 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs podName:b4316c8d-a1d3-4e51-83cc-d0eecb809924 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:51.383426425 +0000 UTC m=+17.044676336 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs") pod "multus-admission-controller-78ddcf56f9-nqn2j" (UID: "b4316c8d-a1d3-4e51-83cc-d0eecb809924") : secret "multus-admission-controller-secret" not found Dec 03 19:55:43.384260 master-0 kubenswrapper[9368]: E1203 19:55:43.383484 9368 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 03 19:55:43.384260 master-0 kubenswrapper[9368]: E1203 19:55:43.383510 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert podName:82718569-4870-4f94-b2e7-7ccd7d4de8ff nodeName:}" failed. No retries permitted until 2025-12-03 19:55:51.383501787 +0000 UTC m=+17.044751698 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert") pod "route-controller-manager-54bbbcd887-h4khj" (UID: "82718569-4870-4f94-b2e7-7ccd7d4de8ff") : secret "serving-cert" not found Dec 03 19:55:43.384260 master-0 kubenswrapper[9368]: E1203 19:55:43.383852 9368 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 03 19:55:43.384260 master-0 kubenswrapper[9368]: E1203 19:55:43.383950 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs podName:46b5d4d0-b841-4e87-84b4-85911ff04325 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:51.383923798 +0000 UTC m=+17.045173739 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs") pod "network-metrics-daemon-hs6gf" (UID: "46b5d4d0-b841-4e87-84b4-85911ff04325") : secret "metrics-daemon-secret" not found Dec 03 19:55:43.384260 master-0 kubenswrapper[9368]: E1203 19:55:43.384041 9368 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Dec 03 19:55:43.384260 master-0 kubenswrapper[9368]: E1203 19:55:43.384078 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert podName:d5f33153-bff1-403f-ae17-b7e90500365d nodeName:}" failed. No retries permitted until 2025-12-03 19:55:51.384066001 +0000 UTC m=+17.045315942 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert") pod "catalog-operator-7cf5cf757f-25z8n" (UID: "d5f33153-bff1-403f-ae17-b7e90500365d") : secret "catalog-operator-serving-cert" not found Dec 03 19:55:43.387495 master-0 kubenswrapper[9368]: I1203 19:55:43.387461 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 19:55:43.388319 master-0 kubenswrapper[9368]: I1203 19:55:43.388278 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert\") pod \"cluster-version-operator-869c786959-zbl42\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:55:43.520566 master-0 kubenswrapper[9368]: I1203 19:55:43.517577 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p"] Dec 03 19:55:43.568405 master-0 kubenswrapper[9368]: I1203 19:55:43.568361 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2"] Dec 03 19:55:43.612065 master-0 kubenswrapper[9368]: I1203 19:55:43.611999 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:55:43.613657 master-0 kubenswrapper[9368]: I1203 19:55:43.613624 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 19:55:43.633911 master-0 kubenswrapper[9368]: W1203 19:55:43.633871 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c45d22f_1492_47d7_83b6_6dd356a8454d.slice/crio-04c8eaa274e2cca2857aa142579311ee009454f560d3839f6a387b3b67a5bfe1 WatchSource:0}: Error finding container 04c8eaa274e2cca2857aa142579311ee009454f560d3839f6a387b3b67a5bfe1: Status 404 returned error can't find the container with id 04c8eaa274e2cca2857aa142579311ee009454f560d3839f6a387b3b67a5bfe1 Dec 03 19:55:43.685215 master-0 kubenswrapper[9368]: I1203 19:55:43.685161 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" event={"ID":"0c45d22f-1492-47d7-83b6-6dd356a8454d","Type":"ContainerStarted","Data":"04c8eaa274e2cca2857aa142579311ee009454f560d3839f6a387b3b67a5bfe1"} Dec 03 19:55:43.688520 master-0 kubenswrapper[9368]: I1203 19:55:43.688480 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" event={"ID":"5decce88-c71e-411c-87b5-a37dd0f77e7b","Type":"ContainerStarted","Data":"dbe65295e2c898be586dca5d88680f9b16d8f0721a6e9ed04f2477053779cf26"} Dec 03 19:55:43.689965 master-0 kubenswrapper[9368]: I1203 19:55:43.689942 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" event={"ID":"128ed384-7ab6-41b6-bf45-c8fda917d52f","Type":"ContainerStarted","Data":"52e10ffcc1fbdf8f2cb9d16e424d95ecef32b76b41b9a925005182a3b5446923"} Dec 03 19:55:43.692409 master-0 kubenswrapper[9368]: I1203 19:55:43.692373 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-8xmrv_0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/openshift-config-operator/0.log" Dec 03 
19:55:43.692865 master-0 kubenswrapper[9368]: I1203 19:55:43.692729 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" event={"ID":"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9","Type":"ContainerStarted","Data":"2dd513c4c7700ec665cd85658968cfa47ab585f4855779f0285e2f319e1b23ec"} Dec 03 19:55:43.693915 master-0 kubenswrapper[9368]: I1203 19:55:43.693868 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 19:55:43.757494 master-0 kubenswrapper[9368]: I1203 19:55:43.757444 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj"] Dec 03 19:55:43.766663 master-0 kubenswrapper[9368]: W1203 19:55:43.766631 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f69a3c7_cb00_4f28_b1e7_52bcdb53fbbf.slice/crio-b7caf673c76ae18dcf4a0dfc42dc02071d5031c44976dc3b0bf55ef4e26083bf WatchSource:0}: Error finding container b7caf673c76ae18dcf4a0dfc42dc02071d5031c44976dc3b0bf55ef4e26083bf: Status 404 returned error can't find the container with id b7caf673c76ae18dcf4a0dfc42dc02071d5031c44976dc3b0bf55ef4e26083bf Dec 03 19:55:44.696871 master-0 kubenswrapper[9368]: I1203 19:55:44.696767 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" event={"ID":"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf","Type":"ContainerStarted","Data":"b7caf673c76ae18dcf4a0dfc42dc02071d5031c44976dc3b0bf55ef4e26083bf"} Dec 03 19:55:45.458219 master-0 kubenswrapper[9368]: I1203 19:55:45.456675 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-86f4cd54cb-7c5dq"] Dec 03 19:55:45.458509 master-0 kubenswrapper[9368]: I1203 19:55:45.458238 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.463666 master-0 kubenswrapper[9368]: I1203 19:55:45.462993 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-0" Dec 03 19:55:45.468293 master-0 kubenswrapper[9368]: I1203 19:55:45.464772 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 19:55:45.468293 master-0 kubenswrapper[9368]: I1203 19:55:45.465233 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 19:55:45.468293 master-0 kubenswrapper[9368]: I1203 19:55:45.465578 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 19:55:45.468293 master-0 kubenswrapper[9368]: I1203 19:55:45.466205 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 19:55:45.468293 master-0 kubenswrapper[9368]: I1203 19:55:45.466227 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 19:55:45.468293 master-0 kubenswrapper[9368]: I1203 19:55:45.466445 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-0" Dec 03 19:55:45.468293 master-0 kubenswrapper[9368]: I1203 19:55:45.466630 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 19:55:45.468293 master-0 kubenswrapper[9368]: I1203 19:55:45.467114 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 19:55:45.478436 master-0 kubenswrapper[9368]: I1203 19:55:45.478020 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 19:55:45.480836 master-0 kubenswrapper[9368]: I1203 19:55:45.479985 9368 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-86f4cd54cb-7c5dq"] Dec 03 19:55:45.507101 master-0 kubenswrapper[9368]: I1203 19:55:45.507059 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-client\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.507742 master-0 kubenswrapper[9368]: I1203 19:55:45.507337 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit-dir\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.507742 master-0 kubenswrapper[9368]: I1203 19:55:45.507388 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/14f3df7c-082e-4555-b545-6b4287f4c1a1-node-pullsecrets\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.507742 master-0 kubenswrapper[9368]: I1203 19:55:45.507415 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-image-import-ca\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.507742 master-0 kubenswrapper[9368]: I1203 19:55:45.507438 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l8m2\" 
(UniqueName: \"kubernetes.io/projected/14f3df7c-082e-4555-b545-6b4287f4c1a1-kube-api-access-5l8m2\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.507742 master-0 kubenswrapper[9368]: I1203 19:55:45.507468 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-config\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.507742 master-0 kubenswrapper[9368]: I1203 19:55:45.507503 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-serving-ca\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.507742 master-0 kubenswrapper[9368]: I1203 19:55:45.507575 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-serving-cert\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.507742 master-0 kubenswrapper[9368]: I1203 19:55:45.507599 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-trusted-ca-bundle\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.507742 master-0 kubenswrapper[9368]: I1203 
19:55:45.507620 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.507742 master-0 kubenswrapper[9368]: I1203 19:55:45.507696 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-encryption-config\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.524877 master-0 kubenswrapper[9368]: I1203 19:55:45.522202 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 19:55:45.608641 master-0 kubenswrapper[9368]: I1203 19:55:45.608602 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-encryption-config\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.608641 master-0 kubenswrapper[9368]: I1203 19:55:45.608659 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-client\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.608919 master-0 kubenswrapper[9368]: I1203 19:55:45.608690 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit-dir\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.608919 master-0 kubenswrapper[9368]: I1203 19:55:45.608721 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/14f3df7c-082e-4555-b545-6b4287f4c1a1-node-pullsecrets\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.608919 master-0 kubenswrapper[9368]: I1203 19:55:45.608737 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-image-import-ca\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.608919 master-0 kubenswrapper[9368]: I1203 19:55:45.608754 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l8m2\" (UniqueName: \"kubernetes.io/projected/14f3df7c-082e-4555-b545-6b4287f4c1a1-kube-api-access-5l8m2\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.608919 master-0 kubenswrapper[9368]: I1203 19:55:45.608794 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-config\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.608919 master-0 kubenswrapper[9368]: I1203 19:55:45.608826 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-serving-ca\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.608919 master-0 kubenswrapper[9368]: I1203 19:55:45.608844 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-serving-cert\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.608919 master-0 kubenswrapper[9368]: I1203 19:55:45.608861 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-trusted-ca-bundle\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.608919 master-0 kubenswrapper[9368]: I1203 19:55:45.608876 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.609325 master-0 kubenswrapper[9368]: E1203 19:55:45.608972 9368 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Dec 03 19:55:45.609325 master-0 kubenswrapper[9368]: E1203 19:55:45.609022 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit podName:14f3df7c-082e-4555-b545-6b4287f4c1a1 nodeName:}" failed. 
No retries permitted until 2025-12-03 19:55:46.109007125 +0000 UTC m=+11.770257046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit") pod "apiserver-86f4cd54cb-7c5dq" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1") : configmap "audit-0" not found Dec 03 19:55:45.609849 master-0 kubenswrapper[9368]: E1203 19:55:45.609561 9368 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 03 19:55:45.609849 master-0 kubenswrapper[9368]: I1203 19:55:45.609646 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/14f3df7c-082e-4555-b545-6b4287f4c1a1-node-pullsecrets\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.609849 master-0 kubenswrapper[9368]: E1203 19:55:45.609658 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-serving-cert podName:14f3df7c-082e-4555-b545-6b4287f4c1a1 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:46.109634341 +0000 UTC m=+11.770884352 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-serving-cert") pod "apiserver-86f4cd54cb-7c5dq" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1") : secret "serving-cert" not found Dec 03 19:55:45.609849 master-0 kubenswrapper[9368]: I1203 19:55:45.609642 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit-dir\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.609849 master-0 kubenswrapper[9368]: E1203 19:55:45.609763 9368 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found Dec 03 19:55:45.610153 master-0 kubenswrapper[9368]: E1203 19:55:45.609865 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-client podName:14f3df7c-082e-4555-b545-6b4287f4c1a1 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:46.109844696 +0000 UTC m=+11.771094617 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-client") pod "apiserver-86f4cd54cb-7c5dq" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1") : secret "etcd-client" not found Dec 03 19:55:45.610840 master-0 kubenswrapper[9368]: I1203 19:55:45.610792 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-config\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.610903 master-0 kubenswrapper[9368]: I1203 19:55:45.610838 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-image-import-ca\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.611263 master-0 kubenswrapper[9368]: I1203 19:55:45.611222 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-trusted-ca-bundle\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.611501 master-0 kubenswrapper[9368]: I1203 19:55:45.611468 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-serving-ca\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.614930 master-0 kubenswrapper[9368]: I1203 19:55:45.614892 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-encryption-config\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:45.633213 master-0 kubenswrapper[9368]: I1203 19:55:45.633139 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l8m2\" (UniqueName: \"kubernetes.io/projected/14f3df7c-082e-4555-b545-6b4287f4c1a1-kube-api-access-5l8m2\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:46.115823 master-0 kubenswrapper[9368]: I1203 19:55:46.115045 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-client\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:46.115823 master-0 kubenswrapper[9368]: I1203 19:55:46.115188 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-serving-cert\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:46.115823 master-0 kubenswrapper[9368]: I1203 19:55:46.115227 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:46.115823 master-0 kubenswrapper[9368]: E1203 19:55:46.115241 9368 secret.go:189] Couldn't get secret 
openshift-apiserver/etcd-client: secret "etcd-client" not found Dec 03 19:55:46.115823 master-0 kubenswrapper[9368]: E1203 19:55:46.115376 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-client podName:14f3df7c-082e-4555-b545-6b4287f4c1a1 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:47.115349276 +0000 UTC m=+12.776599247 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-client") pod "apiserver-86f4cd54cb-7c5dq" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1") : secret "etcd-client" not found Dec 03 19:55:46.115823 master-0 kubenswrapper[9368]: E1203 19:55:46.115462 9368 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Dec 03 19:55:46.115823 master-0 kubenswrapper[9368]: E1203 19:55:46.115521 9368 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 03 19:55:46.115823 master-0 kubenswrapper[9368]: E1203 19:55:46.115532 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit podName:14f3df7c-082e-4555-b545-6b4287f4c1a1 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:47.11551103 +0000 UTC m=+12.776761011 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit") pod "apiserver-86f4cd54cb-7c5dq" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1") : configmap "audit-0" not found Dec 03 19:55:46.115823 master-0 kubenswrapper[9368]: E1203 19:55:46.115631 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-serving-cert podName:14f3df7c-082e-4555-b545-6b4287f4c1a1 nodeName:}" failed. 
No retries permitted until 2025-12-03 19:55:47.115606462 +0000 UTC m=+12.776856393 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-serving-cert") pod "apiserver-86f4cd54cb-7c5dq" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1") : secret "serving-cert" not found Dec 03 19:55:46.363482 master-0 kubenswrapper[9368]: I1203 19:55:46.362101 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-l789w"] Dec 03 19:55:46.363482 master-0 kubenswrapper[9368]: I1203 19:55:46.363149 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.420277 master-0 kubenswrapper[9368]: I1203 19:55:46.419685 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-modprobe-d\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.420277 master-0 kubenswrapper[9368]: I1203 19:55:46.419723 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-systemd\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.420277 master-0 kubenswrapper[9368]: I1203 19:55:46.419741 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs8fx\" (UniqueName: \"kubernetes.io/projected/d7171597-cb9a-451c-80a4-64cfccf885f0-kube-api-access-gs8fx\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " 
pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.420277 master-0 kubenswrapper[9368]: I1203 19:55:46.419760 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-run\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.420277 master-0 kubenswrapper[9368]: I1203 19:55:46.419795 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-var-lib-kubelet\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.420277 master-0 kubenswrapper[9368]: I1203 19:55:46.419834 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7171597-cb9a-451c-80a4-64cfccf885f0-tmp\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.420277 master-0 kubenswrapper[9368]: I1203 19:55:46.419884 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-kubernetes\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.420277 master-0 kubenswrapper[9368]: I1203 19:55:46.419908 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-sysconfig\") pod \"tuned-l789w\" (UID: 
\"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.420277 master-0 kubenswrapper[9368]: I1203 19:55:46.419942 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-lib-modules\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.420277 master-0 kubenswrapper[9368]: I1203 19:55:46.419968 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-sys\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.420277 master-0 kubenswrapper[9368]: I1203 19:55:46.419986 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-tuned\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.420277 master-0 kubenswrapper[9368]: I1203 19:55:46.420019 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-sysctl-d\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.420277 master-0 kubenswrapper[9368]: I1203 19:55:46.420035 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-host\") pod 
\"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.420277 master-0 kubenswrapper[9368]: I1203 19:55:46.420056 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-sysctl-conf\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.520703 master-0 kubenswrapper[9368]: I1203 19:55:46.520603 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-lib-modules\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.520703 master-0 kubenswrapper[9368]: I1203 19:55:46.520654 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-sys\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.520703 master-0 kubenswrapper[9368]: I1203 19:55:46.520672 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-tuned\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.520951 master-0 kubenswrapper[9368]: I1203 19:55:46.520712 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-sysctl-d\") pod \"tuned-l789w\" (UID: 
\"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.520951 master-0 kubenswrapper[9368]: I1203 19:55:46.520735 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-host\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.520951 master-0 kubenswrapper[9368]: I1203 19:55:46.520762 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-sysctl-conf\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.520951 master-0 kubenswrapper[9368]: I1203 19:55:46.520800 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-modprobe-d\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.520951 master-0 kubenswrapper[9368]: I1203 19:55:46.520815 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-systemd\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.520951 master-0 kubenswrapper[9368]: I1203 19:55:46.520831 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs8fx\" (UniqueName: \"kubernetes.io/projected/d7171597-cb9a-451c-80a4-64cfccf885f0-kube-api-access-gs8fx\") pod \"tuned-l789w\" (UID: 
\"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.520951 master-0 kubenswrapper[9368]: I1203 19:55:46.520846 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-run\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.520951 master-0 kubenswrapper[9368]: I1203 19:55:46.520862 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-var-lib-kubelet\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.520951 master-0 kubenswrapper[9368]: I1203 19:55:46.520891 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7171597-cb9a-451c-80a4-64cfccf885f0-tmp\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.520951 master-0 kubenswrapper[9368]: I1203 19:55:46.520939 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-kubernetes\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.521204 master-0 kubenswrapper[9368]: I1203 19:55:46.520961 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-sysconfig\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " 
pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.521204 master-0 kubenswrapper[9368]: I1203 19:55:46.521060 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-sysconfig\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.521255 master-0 kubenswrapper[9368]: I1203 19:55:46.521228 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-lib-modules\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.521283 master-0 kubenswrapper[9368]: I1203 19:55:46.521256 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-sys\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.522051 master-0 kubenswrapper[9368]: I1203 19:55:46.521413 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-systemd\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.522051 master-0 kubenswrapper[9368]: I1203 19:55:46.521483 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-sysctl-d\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.522051 master-0 
kubenswrapper[9368]: I1203 19:55:46.521510 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-host\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.522051 master-0 kubenswrapper[9368]: I1203 19:55:46.521628 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-sysctl-conf\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.522051 master-0 kubenswrapper[9368]: I1203 19:55:46.521666 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-var-lib-kubelet\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.522051 master-0 kubenswrapper[9368]: I1203 19:55:46.521756 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-modprobe-d\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.522051 master-0 kubenswrapper[9368]: I1203 19:55:46.521892 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-run\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.522051 master-0 kubenswrapper[9368]: I1203 19:55:46.522010 9368 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-kubernetes\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.524836 master-0 kubenswrapper[9368]: I1203 19:55:46.524660 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7171597-cb9a-451c-80a4-64cfccf885f0-tmp\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.525939 master-0 kubenswrapper[9368]: I1203 19:55:46.525918 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-tuned\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.537530 master-0 kubenswrapper[9368]: I1203 19:55:46.537492 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs8fx\" (UniqueName: \"kubernetes.io/projected/d7171597-cb9a-451c-80a4-64cfccf885f0-kube-api-access-gs8fx\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.686215 master-0 kubenswrapper[9368]: I1203 19:55:46.686134 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 19:55:46.702181 master-0 kubenswrapper[9368]: W1203 19:55:46.702131 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7171597_cb9a_451c_80a4_64cfccf885f0.slice/crio-60f0cb36c47fd9046d1059d86dff60d9f8e02831ea35b7e68a92310a9b8dd92a WatchSource:0}: Error finding container 60f0cb36c47fd9046d1059d86dff60d9f8e02831ea35b7e68a92310a9b8dd92a: Status 404 returned error can't find the container with id 60f0cb36c47fd9046d1059d86dff60d9f8e02831ea35b7e68a92310a9b8dd92a Dec 03 19:55:46.705011 master-0 kubenswrapper[9368]: I1203 19:55:46.704404 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" event={"ID":"7ed25861-1328-45e7-922e-37588a0b019c","Type":"ContainerStarted","Data":"b15d5b3401a95a50f5c18b6410300731cd922d460a927b29c822856e4c00523b"} Dec 03 19:55:46.707939 master-0 kubenswrapper[9368]: I1203 19:55:46.707892 9368 generic.go:334] "Generic (PLEG): container finished" podID="f9f99422-7991-40ef-92a1-de2e603e47b9" containerID="52062cf7e28f06e4b78d834f54e665243402b015a9d5ef15880a1512af2a4c43" exitCode=0 Dec 03 19:55:46.707990 master-0 kubenswrapper[9368]: I1203 19:55:46.707952 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" event={"ID":"f9f99422-7991-40ef-92a1-de2e603e47b9","Type":"ContainerDied","Data":"52062cf7e28f06e4b78d834f54e665243402b015a9d5ef15880a1512af2a4c43"} Dec 03 19:55:46.709246 master-0 kubenswrapper[9368]: I1203 19:55:46.708317 9368 scope.go:117] "RemoveContainer" containerID="52062cf7e28f06e4b78d834f54e665243402b015a9d5ef15880a1512af2a4c43" Dec 03 19:55:46.709900 master-0 kubenswrapper[9368]: I1203 19:55:46.709644 9368 generic.go:334] "Generic (PLEG): container finished" podID="78a864f2-934f-4197-9753-24c9bc7f1fca" 
containerID="86fb2ded70064a9e30cf3bd596a82e68f52a88cf948050917e5c6fb69423eb23" exitCode=0 Dec 03 19:55:46.709900 master-0 kubenswrapper[9368]: I1203 19:55:46.709720 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" event={"ID":"78a864f2-934f-4197-9753-24c9bc7f1fca","Type":"ContainerDied","Data":"86fb2ded70064a9e30cf3bd596a82e68f52a88cf948050917e5c6fb69423eb23"} Dec 03 19:55:46.709900 master-0 kubenswrapper[9368]: I1203 19:55:46.709795 9368 scope.go:117] "RemoveContainer" containerID="0a8f6a401bc81d9be5a9cf7156ef428d64e4a5a0d08e4c992efc6ddc65d0a9c3" Dec 03 19:55:46.710129 master-0 kubenswrapper[9368]: I1203 19:55:46.710050 9368 scope.go:117] "RemoveContainer" containerID="86fb2ded70064a9e30cf3bd596a82e68f52a88cf948050917e5c6fb69423eb23" Dec 03 19:55:46.710224 master-0 kubenswrapper[9368]: E1203 19:55:46.710197 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=etcd-operator pod=etcd-operator-7978bf889c-mqpzf_openshift-etcd-operator(78a864f2-934f-4197-9753-24c9bc7f1fca)\"" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" podUID="78a864f2-934f-4197-9753-24c9bc7f1fca" Dec 03 19:55:46.722336 master-0 kubenswrapper[9368]: I1203 19:55:46.722300 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-client-ca\") pod \"controller-manager-768d5b868-82c4q\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " pod="openshift-controller-manager/controller-manager-768d5b868-82c4q" Dec 03 19:55:46.722453 master-0 kubenswrapper[9368]: E1203 19:55:46.722432 9368 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 03 19:55:46.722828 master-0 kubenswrapper[9368]: E1203 19:55:46.722485 9368 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-client-ca podName:5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:54.722468086 +0000 UTC m=+20.383717997 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-client-ca") pod "controller-manager-768d5b868-82c4q" (UID: "5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5") : configmap "client-ca" not found Dec 03 19:55:46.872089 master-0 kubenswrapper[9368]: I1203 19:55:46.872044 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 19:55:47.126807 master-0 kubenswrapper[9368]: I1203 19:55:47.126746 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-client\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:47.127537 master-0 kubenswrapper[9368]: I1203 19:55:47.126850 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-serving-cert\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:47.127537 master-0 kubenswrapper[9368]: I1203 19:55:47.126874 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 
19:55:47.127537 master-0 kubenswrapper[9368]: E1203 19:55:47.126928 9368 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 03 19:55:47.127537 master-0 kubenswrapper[9368]: E1203 19:55:47.126947 9368 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Dec 03 19:55:47.127537 master-0 kubenswrapper[9368]: E1203 19:55:47.126928 9368 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found Dec 03 19:55:47.127537 master-0 kubenswrapper[9368]: E1203 19:55:47.126994 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit podName:14f3df7c-082e-4555-b545-6b4287f4c1a1 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:49.126980971 +0000 UTC m=+14.788230872 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit") pod "apiserver-86f4cd54cb-7c5dq" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1") : configmap "audit-0" not found Dec 03 19:55:47.127537 master-0 kubenswrapper[9368]: E1203 19:55:47.127017 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-client podName:14f3df7c-082e-4555-b545-6b4287f4c1a1 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:49.126999871 +0000 UTC m=+14.788249782 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-client") pod "apiserver-86f4cd54cb-7c5dq" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1") : secret "etcd-client" not found Dec 03 19:55:47.127537 master-0 kubenswrapper[9368]: E1203 19:55:47.127032 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-serving-cert podName:14f3df7c-082e-4555-b545-6b4287f4c1a1 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:49.127026812 +0000 UTC m=+14.788276723 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-serving-cert") pod "apiserver-86f4cd54cb-7c5dq" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1") : secret "serving-cert" not found Dec 03 19:55:47.714871 master-0 kubenswrapper[9368]: I1203 19:55:47.714574 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-l789w" event={"ID":"d7171597-cb9a-451c-80a4-64cfccf885f0","Type":"ContainerStarted","Data":"137caece2a953cee4091430a595bc0f222af1917d95fa18746325e8b65eac41d"} Dec 03 19:55:47.714871 master-0 kubenswrapper[9368]: I1203 19:55:47.714873 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-l789w" event={"ID":"d7171597-cb9a-451c-80a4-64cfccf885f0","Type":"ContainerStarted","Data":"60f0cb36c47fd9046d1059d86dff60d9f8e02831ea35b7e68a92310a9b8dd92a"} Dec 03 19:55:47.718623 master-0 kubenswrapper[9368]: I1203 19:55:47.718591 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" event={"ID":"f9f99422-7991-40ef-92a1-de2e603e47b9","Type":"ContainerStarted","Data":"9936bd164d7a83dfd6c86c4312838d63181895add63b7d1de35a090b8b7d369b"} Dec 03 19:55:47.777954 master-0 kubenswrapper[9368]: I1203 
19:55:47.777876 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-l789w" podStartSLOduration=1.7778550929999999 podStartE2EDuration="1.777855093s" podCreationTimestamp="2025-12-03 19:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:55:47.734189904 +0000 UTC m=+13.395439815" watchObservedRunningTime="2025-12-03 19:55:47.777855093 +0000 UTC m=+13.439105004" Dec 03 19:55:48.008407 master-0 kubenswrapper[9368]: I1203 19:55:48.008242 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 03 19:55:48.008699 master-0 kubenswrapper[9368]: I1203 19:55:48.008665 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 19:55:48.011065 master-0 kubenswrapper[9368]: I1203 19:55:48.010407 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Dec 03 19:55:48.017942 master-0 kubenswrapper[9368]: I1203 19:55:48.017907 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 03 19:55:48.039895 master-0 kubenswrapper[9368]: I1203 19:55:48.039837 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/692f1783-2d80-48a7-af1b-58a1f3f99315-kube-api-access\") pod \"installer-1-master-0\" (UID: \"692f1783-2d80-48a7-af1b-58a1f3f99315\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 19:55:48.039895 master-0 kubenswrapper[9368]: I1203 19:55:48.039898 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/692f1783-2d80-48a7-af1b-58a1f3f99315-kubelet-dir\") 
pod \"installer-1-master-0\" (UID: \"692f1783-2d80-48a7-af1b-58a1f3f99315\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 19:55:48.040108 master-0 kubenswrapper[9368]: I1203 19:55:48.039978 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/692f1783-2d80-48a7-af1b-58a1f3f99315-var-lock\") pod \"installer-1-master-0\" (UID: \"692f1783-2d80-48a7-af1b-58a1f3f99315\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 19:55:48.140819 master-0 kubenswrapper[9368]: I1203 19:55:48.140723 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/692f1783-2d80-48a7-af1b-58a1f3f99315-kube-api-access\") pod \"installer-1-master-0\" (UID: \"692f1783-2d80-48a7-af1b-58a1f3f99315\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 19:55:48.141204 master-0 kubenswrapper[9368]: I1203 19:55:48.140877 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/692f1783-2d80-48a7-af1b-58a1f3f99315-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"692f1783-2d80-48a7-af1b-58a1f3f99315\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 19:55:48.141204 master-0 kubenswrapper[9368]: I1203 19:55:48.140994 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/692f1783-2d80-48a7-af1b-58a1f3f99315-var-lock\") pod \"installer-1-master-0\" (UID: \"692f1783-2d80-48a7-af1b-58a1f3f99315\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 19:55:48.141312 master-0 kubenswrapper[9368]: I1203 19:55:48.141212 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/692f1783-2d80-48a7-af1b-58a1f3f99315-var-lock\") pod \"installer-1-master-0\" 
(UID: \"692f1783-2d80-48a7-af1b-58a1f3f99315\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 19:55:48.141312 master-0 kubenswrapper[9368]: I1203 19:55:48.141221 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/692f1783-2d80-48a7-af1b-58a1f3f99315-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"692f1783-2d80-48a7-af1b-58a1f3f99315\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 19:55:48.169847 master-0 kubenswrapper[9368]: I1203 19:55:48.169771 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/692f1783-2d80-48a7-af1b-58a1f3f99315-kube-api-access\") pod \"installer-1-master-0\" (UID: \"692f1783-2d80-48a7-af1b-58a1f3f99315\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 19:55:48.356600 master-0 kubenswrapper[9368]: I1203 19:55:48.356378 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 19:55:49.168955 master-0 kubenswrapper[9368]: I1203 19:55:49.168860 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-serving-cert\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:49.168955 master-0 kubenswrapper[9368]: I1203 19:55:49.168952 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:49.170983 master-0 kubenswrapper[9368]: E1203 19:55:49.169096 9368 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 03 19:55:49.170983 master-0 kubenswrapper[9368]: E1203 19:55:49.169192 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-serving-cert podName:14f3df7c-082e-4555-b545-6b4287f4c1a1 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:53.169167583 +0000 UTC m=+18.830417514 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-serving-cert") pod "apiserver-86f4cd54cb-7c5dq" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1") : secret "serving-cert" not found Dec 03 19:55:49.170983 master-0 kubenswrapper[9368]: I1203 19:55:49.169113 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-client\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:49.170983 master-0 kubenswrapper[9368]: E1203 19:55:49.169891 9368 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Dec 03 19:55:49.170983 master-0 kubenswrapper[9368]: E1203 19:55:49.169926 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit podName:14f3df7c-082e-4555-b545-6b4287f4c1a1 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:53.169916111 +0000 UTC m=+18.831166032 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit") pod "apiserver-86f4cd54cb-7c5dq" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1") : configmap "audit-0" not found Dec 03 19:55:49.176307 master-0 kubenswrapper[9368]: I1203 19:55:49.176246 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-client\") pod \"apiserver-86f4cd54cb-7c5dq\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:49.665728 master-0 kubenswrapper[9368]: I1203 19:55:49.665674 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-86f4cd54cb-7c5dq"] Dec 03 19:55:49.666384 master-0 kubenswrapper[9368]: E1203 19:55:49.666347 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" podUID="14f3df7c-082e-4555-b545-6b4287f4c1a1" Dec 03 19:55:49.724766 master-0 kubenswrapper[9368]: I1203 19:55:49.724721 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:49.734010 master-0 kubenswrapper[9368]: I1203 19:55:49.733975 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:49.776039 master-0 kubenswrapper[9368]: I1203 19:55:49.775419 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-config\") pod \"14f3df7c-082e-4555-b545-6b4287f4c1a1\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " Dec 03 19:55:49.776039 master-0 kubenswrapper[9368]: I1203 19:55:49.775466 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-trusted-ca-bundle\") pod \"14f3df7c-082e-4555-b545-6b4287f4c1a1\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " Dec 03 19:55:49.776039 master-0 kubenswrapper[9368]: I1203 19:55:49.775525 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-serving-ca\") pod \"14f3df7c-082e-4555-b545-6b4287f4c1a1\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " Dec 03 19:55:49.776039 master-0 kubenswrapper[9368]: I1203 19:55:49.775553 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l8m2\" (UniqueName: \"kubernetes.io/projected/14f3df7c-082e-4555-b545-6b4287f4c1a1-kube-api-access-5l8m2\") pod \"14f3df7c-082e-4555-b545-6b4287f4c1a1\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " Dec 03 19:55:49.776039 master-0 kubenswrapper[9368]: I1203 19:55:49.775586 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/14f3df7c-082e-4555-b545-6b4287f4c1a1-node-pullsecrets\") pod \"14f3df7c-082e-4555-b545-6b4287f4c1a1\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " Dec 03 19:55:49.776039 master-0 kubenswrapper[9368]: I1203 19:55:49.775610 9368 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-image-import-ca\") pod \"14f3df7c-082e-4555-b545-6b4287f4c1a1\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " Dec 03 19:55:49.776039 master-0 kubenswrapper[9368]: I1203 19:55:49.775639 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-encryption-config\") pod \"14f3df7c-082e-4555-b545-6b4287f4c1a1\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " Dec 03 19:55:49.776039 master-0 kubenswrapper[9368]: I1203 19:55:49.775664 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-client\") pod \"14f3df7c-082e-4555-b545-6b4287f4c1a1\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " Dec 03 19:55:49.776039 master-0 kubenswrapper[9368]: I1203 19:55:49.775700 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit-dir\") pod \"14f3df7c-082e-4555-b545-6b4287f4c1a1\" (UID: \"14f3df7c-082e-4555-b545-6b4287f4c1a1\") " Dec 03 19:55:49.776039 master-0 kubenswrapper[9368]: I1203 19:55:49.775755 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14f3df7c-082e-4555-b545-6b4287f4c1a1-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "14f3df7c-082e-4555-b545-6b4287f4c1a1" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:55:49.776039 master-0 kubenswrapper[9368]: I1203 19:55:49.775993 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-config" (OuterVolumeSpecName: "config") pod "14f3df7c-082e-4555-b545-6b4287f4c1a1" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:55:49.776581 master-0 kubenswrapper[9368]: I1203 19:55:49.776090 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "14f3df7c-082e-4555-b545-6b4287f4c1a1" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:55:49.776581 master-0 kubenswrapper[9368]: I1203 19:55:49.776193 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "14f3df7c-082e-4555-b545-6b4287f4c1a1" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:55:49.776581 master-0 kubenswrapper[9368]: I1203 19:55:49.776325 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "14f3df7c-082e-4555-b545-6b4287f4c1a1" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:55:49.776581 master-0 kubenswrapper[9368]: I1203 19:55:49.776436 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "14f3df7c-082e-4555-b545-6b4287f4c1a1" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:55:49.778637 master-0 kubenswrapper[9368]: I1203 19:55:49.778610 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "14f3df7c-082e-4555-b545-6b4287f4c1a1" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:55:49.778987 master-0 kubenswrapper[9368]: I1203 19:55:49.778925 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "14f3df7c-082e-4555-b545-6b4287f4c1a1" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:55:49.798889 master-0 kubenswrapper[9368]: I1203 19:55:49.797764 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14f3df7c-082e-4555-b545-6b4287f4c1a1-kube-api-access-5l8m2" (OuterVolumeSpecName: "kube-api-access-5l8m2") pod "14f3df7c-082e-4555-b545-6b4287f4c1a1" (UID: "14f3df7c-082e-4555-b545-6b4287f4c1a1"). InnerVolumeSpecName "kube-api-access-5l8m2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:55:49.877202 master-0 kubenswrapper[9368]: I1203 19:55:49.877143 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l8m2\" (UniqueName: \"kubernetes.io/projected/14f3df7c-082e-4555-b545-6b4287f4c1a1-kube-api-access-5l8m2\") on node \"master-0\" DevicePath \"\"" Dec 03 19:55:49.877202 master-0 kubenswrapper[9368]: I1203 19:55:49.877208 9368 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 19:55:49.877397 master-0 kubenswrapper[9368]: I1203 19:55:49.877232 9368 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/14f3df7c-082e-4555-b545-6b4287f4c1a1-node-pullsecrets\") on node \"master-0\" DevicePath \"\"" Dec 03 19:55:49.877397 master-0 kubenswrapper[9368]: I1203 19:55:49.877249 9368 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-image-import-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 19:55:49.877397 master-0 kubenswrapper[9368]: I1203 19:55:49.877266 9368 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-encryption-config\") on node \"master-0\" DevicePath \"\"" Dec 03 19:55:49.877397 master-0 kubenswrapper[9368]: I1203 19:55:49.877281 9368 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-etcd-client\") on node \"master-0\" DevicePath \"\"" Dec 03 19:55:49.877397 master-0 kubenswrapper[9368]: I1203 19:55:49.877296 9368 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 19:55:49.877397 master-0 kubenswrapper[9368]: I1203 19:55:49.877311 9368 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-config\") on node \"master-0\" DevicePath \"\"" Dec 03 19:55:49.877397 master-0 kubenswrapper[9368]: I1203 19:55:49.877328 9368 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 19:55:50.737949 master-0 kubenswrapper[9368]: I1203 19:55:50.735100 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" event={"ID":"5decce88-c71e-411c-87b5-a37dd0f77e7b","Type":"ContainerStarted","Data":"ce3971a00b14ee7d8820c7e2ce38f070172641049e39dce3eb3a076d83a464ea"} Dec 03 19:55:50.739534 master-0 kubenswrapper[9368]: I1203 19:55:50.739325 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" event={"ID":"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf","Type":"ContainerStarted","Data":"00ef38cb5e4574cde1559c4f74b2af2d1020f41ece0ea48de28dfccd34cbb389"} Dec 03 19:55:50.740571 master-0 kubenswrapper[9368]: I1203 19:55:50.740527 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" event={"ID":"128ed384-7ab6-41b6-bf45-c8fda917d52f","Type":"ContainerStarted","Data":"5410d9a8f5d8f6148e42af7d496c06c178d9d3caeec5208829cfebc5cf792c8b"} Dec 03 19:55:50.743493 master-0 kubenswrapper[9368]: I1203 19:55:50.743461 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-86f4cd54cb-7c5dq" Dec 03 19:55:50.745699 master-0 kubenswrapper[9368]: I1203 19:55:50.744269 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" event={"ID":"0c45d22f-1492-47d7-83b6-6dd356a8454d","Type":"ContainerStarted","Data":"87bcebaceb595ec403e81aaa5fffa9154881610a79d28fb3cc8af6166aa4a671"} Dec 03 19:55:50.810803 master-0 kubenswrapper[9368]: I1203 19:55:50.810634 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-b46c54696-bgb45"] Dec 03 19:55:50.811710 master-0 kubenswrapper[9368]: I1203 19:55:50.811671 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.814987 master-0 kubenswrapper[9368]: I1203 19:55:50.814845 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 19:55:50.815098 master-0 kubenswrapper[9368]: I1203 19:55:50.815032 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 19:55:50.815870 master-0 kubenswrapper[9368]: I1203 19:55:50.815153 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 19:55:50.818463 master-0 kubenswrapper[9368]: I1203 19:55:50.818427 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 19:55:50.818553 master-0 kubenswrapper[9368]: I1203 19:55:50.818505 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 19:55:50.818694 master-0 kubenswrapper[9368]: I1203 19:55:50.818635 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 19:55:50.818694 master-0 kubenswrapper[9368]: I1203 19:55:50.818671 
9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 19:55:50.818798 master-0 kubenswrapper[9368]: I1203 19:55:50.818695 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-86f4cd54cb-7c5dq"] Dec 03 19:55:50.818798 master-0 kubenswrapper[9368]: I1203 19:55:50.818733 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 19:55:50.818798 master-0 kubenswrapper[9368]: I1203 19:55:50.818767 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 19:55:50.829578 master-0 kubenswrapper[9368]: I1203 19:55:50.829170 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 19:55:50.832407 master-0 kubenswrapper[9368]: I1203 19:55:50.832021 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-b46c54696-bgb45"] Dec 03 19:55:50.832407 master-0 kubenswrapper[9368]: I1203 19:55:50.832269 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-86f4cd54cb-7c5dq"] Dec 03 19:55:50.848104 master-0 kubenswrapper[9368]: I1203 19:55:50.847938 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 03 19:55:50.890939 master-0 kubenswrapper[9368]: I1203 19:55:50.890880 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-config\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.890939 master-0 kubenswrapper[9368]: I1203 19:55:50.890921 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-etcd-client\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.890939 master-0 kubenswrapper[9368]: I1203 19:55:50.890941 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-serving-cert\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.891102 master-0 kubenswrapper[9368]: I1203 19:55:50.890970 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-encryption-config\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.891102 master-0 kubenswrapper[9368]: I1203 19:55:50.890997 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-audit\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.891102 master-0 kubenswrapper[9368]: I1203 19:55:50.891013 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-image-import-ca\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.891102 master-0 kubenswrapper[9368]: I1203 19:55:50.891030 9368 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c593a75e-c2af-4419-94da-e0c9ff14c41f-node-pullsecrets\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.891288 master-0 kubenswrapper[9368]: I1203 19:55:50.891113 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2xcx\" (UniqueName: \"kubernetes.io/projected/c593a75e-c2af-4419-94da-e0c9ff14c41f-kube-api-access-j2xcx\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.891288 master-0 kubenswrapper[9368]: I1203 19:55:50.891132 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-etcd-serving-ca\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.891288 master-0 kubenswrapper[9368]: I1203 19:55:50.891172 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c593a75e-c2af-4419-94da-e0c9ff14c41f-audit-dir\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.891288 master-0 kubenswrapper[9368]: I1203 19:55:50.891197 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-trusted-ca-bundle\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " 
pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.891288 master-0 kubenswrapper[9368]: I1203 19:55:50.891239 9368 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/14f3df7c-082e-4555-b545-6b4287f4c1a1-audit\") on node \"master-0\" DevicePath \"\"" Dec 03 19:55:50.891288 master-0 kubenswrapper[9368]: I1203 19:55:50.891250 9368 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14f3df7c-082e-4555-b545-6b4287f4c1a1-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 19:55:50.928817 master-0 kubenswrapper[9368]: I1203 19:55:50.928735 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-768d5b868-82c4q"] Dec 03 19:55:50.929048 master-0 kubenswrapper[9368]: E1203 19:55:50.928995 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-768d5b868-82c4q" podUID="5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5" Dec 03 19:55:50.992446 master-0 kubenswrapper[9368]: I1203 19:55:50.992395 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-config\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.992446 master-0 kubenswrapper[9368]: I1203 19:55:50.992444 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-etcd-client\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.992639 master-0 kubenswrapper[9368]: I1203 
19:55:50.992465 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-serving-cert\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.992639 master-0 kubenswrapper[9368]: I1203 19:55:50.992480 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-encryption-config\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.992639 master-0 kubenswrapper[9368]: I1203 19:55:50.992511 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-audit\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.992639 master-0 kubenswrapper[9368]: I1203 19:55:50.992528 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c593a75e-c2af-4419-94da-e0c9ff14c41f-node-pullsecrets\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.992639 master-0 kubenswrapper[9368]: I1203 19:55:50.992544 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-image-import-ca\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.992639 master-0 
kubenswrapper[9368]: I1203 19:55:50.992573 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2xcx\" (UniqueName: \"kubernetes.io/projected/c593a75e-c2af-4419-94da-e0c9ff14c41f-kube-api-access-j2xcx\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.992639 master-0 kubenswrapper[9368]: I1203 19:55:50.992593 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-etcd-serving-ca\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.992639 master-0 kubenswrapper[9368]: I1203 19:55:50.992632 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c593a75e-c2af-4419-94da-e0c9ff14c41f-audit-dir\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.992897 master-0 kubenswrapper[9368]: I1203 19:55:50.992656 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-trusted-ca-bundle\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.993812 master-0 kubenswrapper[9368]: I1203 19:55:50.993770 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-trusted-ca-bundle\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " 
pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.994369 master-0 kubenswrapper[9368]: I1203 19:55:50.993941 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c593a75e-c2af-4419-94da-e0c9ff14c41f-node-pullsecrets\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.994369 master-0 kubenswrapper[9368]: I1203 19:55:50.994008 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c593a75e-c2af-4419-94da-e0c9ff14c41f-audit-dir\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.994369 master-0 kubenswrapper[9368]: E1203 19:55:50.994069 9368 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 03 19:55:50.994369 master-0 kubenswrapper[9368]: E1203 19:55:50.994149 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-serving-cert podName:c593a75e-c2af-4419-94da-e0c9ff14c41f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:51.494129137 +0000 UTC m=+17.155379138 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-serving-cert") pod "apiserver-b46c54696-bgb45" (UID: "c593a75e-c2af-4419-94da-e0c9ff14c41f") : secret "serving-cert" not found Dec 03 19:55:50.994369 master-0 kubenswrapper[9368]: I1203 19:55:50.994201 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-config\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.994624 master-0 kubenswrapper[9368]: I1203 19:55:50.994512 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-image-import-ca\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.994672 master-0 kubenswrapper[9368]: I1203 19:55:50.994637 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-etcd-serving-ca\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.994902 master-0 kubenswrapper[9368]: I1203 19:55:50.994867 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-audit\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.996579 master-0 kubenswrapper[9368]: I1203 19:55:50.996551 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-encryption-config\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:50.996726 master-0 kubenswrapper[9368]: I1203 19:55:50.996699 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-etcd-client\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:51.025167 master-0 kubenswrapper[9368]: I1203 19:55:51.025135 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2xcx\" (UniqueName: \"kubernetes.io/projected/c593a75e-c2af-4419-94da-e0c9ff14c41f-kube-api-access-j2xcx\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:51.296465 master-0 kubenswrapper[9368]: I1203 19:55:51.295607 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" Dec 03 19:55:51.296465 master-0 kubenswrapper[9368]: I1203 19:55:51.296430 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:51.296651 master-0 kubenswrapper[9368]: 
I1203 19:55:51.296495 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 19:55:51.296651 master-0 kubenswrapper[9368]: I1203 19:55:51.296527 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" Dec 03 19:55:51.299402 master-0 kubenswrapper[9368]: I1203 19:55:51.299362 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" Dec 03 19:55:51.299479 master-0 kubenswrapper[9368]: I1203 19:55:51.299422 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:51.299995 master-0 kubenswrapper[9368]: I1203 19:55:51.299963 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 19:55:51.300046 master-0 kubenswrapper[9368]: I1203 19:55:51.299999 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" Dec 03 19:55:51.398047 master-0 kubenswrapper[9368]: I1203 19:55:51.398005 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 19:55:51.398047 master-0 kubenswrapper[9368]: I1203 19:55:51.398054 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" Dec 03 19:55:51.398257 master-0 kubenswrapper[9368]: I1203 19:55:51.398090 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:55:51.398257 master-0 
kubenswrapper[9368]: I1203 19:55:51.398116 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" Dec 03 19:55:51.398257 master-0 kubenswrapper[9368]: I1203 19:55:51.398171 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" Dec 03 19:55:51.398336 master-0 kubenswrapper[9368]: E1203 19:55:51.398279 9368 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 03 19:55:51.398336 master-0 kubenswrapper[9368]: E1203 19:55:51.398323 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca podName:82718569-4870-4f94-b2e7-7ccd7d4de8ff nodeName:}" failed. No retries permitted until 2025-12-03 19:56:07.398310433 +0000 UTC m=+33.059560344 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca") pod "route-controller-manager-54bbbcd887-h4khj" (UID: "82718569-4870-4f94-b2e7-7ccd7d4de8ff") : configmap "client-ca" not found Dec 03 19:55:51.398953 master-0 kubenswrapper[9368]: E1203 19:55:51.398904 9368 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 03 19:55:51.399017 master-0 kubenswrapper[9368]: E1203 19:55:51.398999 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert podName:82718569-4870-4f94-b2e7-7ccd7d4de8ff nodeName:}" failed. No retries permitted until 2025-12-03 19:56:07.39898016 +0000 UTC m=+33.060230151 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert") pod "route-controller-manager-54bbbcd887-h4khj" (UID: "82718569-4870-4f94-b2e7-7ccd7d4de8ff") : secret "serving-cert" not found Dec 03 19:55:51.406678 master-0 kubenswrapper[9368]: I1203 19:55:51.403539 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:55:51.406678 master-0 kubenswrapper[9368]: I1203 19:55:51.405671 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 19:55:51.406678 master-0 kubenswrapper[9368]: I1203 
19:55:51.405978 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" Dec 03 19:55:51.406678 master-0 kubenswrapper[9368]: I1203 19:55:51.406557 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" Dec 03 19:55:51.408044 master-0 kubenswrapper[9368]: I1203 19:55:51.407954 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 19:55:51.409025 master-0 kubenswrapper[9368]: I1203 19:55:51.408998 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-nqn2j\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" Dec 03 19:55:51.415640 master-0 kubenswrapper[9368]: I1203 19:55:51.414107 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dbfhg"] Dec 03 19:55:51.415640 master-0 kubenswrapper[9368]: I1203 19:55:51.415254 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dbfhg" Dec 03 19:55:51.415640 master-0 kubenswrapper[9368]: I1203 19:55:51.415419 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 19:55:51.416593 master-0 kubenswrapper[9368]: I1203 19:55:51.416560 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 19:55:51.416795 master-0 kubenswrapper[9368]: I1203 19:55:51.416649 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:51.417284 master-0 kubenswrapper[9368]: I1203 19:55:51.417211 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 19:55:51.418064 master-0 kubenswrapper[9368]: I1203 19:55:51.417633 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 19:55:51.418064 master-0 kubenswrapper[9368]: I1203 19:55:51.417769 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 19:55:51.430297 master-0 kubenswrapper[9368]: I1203 19:55:51.430248 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 19:55:51.432085 master-0 kubenswrapper[9368]: I1203 19:55:51.432015 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" Dec 03 19:55:51.432308 master-0 kubenswrapper[9368]: I1203 19:55:51.432275 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dbfhg"] Dec 03 19:55:51.498898 master-0 kubenswrapper[9368]: I1203 19:55:51.498860 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d196dca7-f940-4aa0-b20a-214d22b62db6-metrics-tls\") pod \"dns-default-dbfhg\" (UID: \"d196dca7-f940-4aa0-b20a-214d22b62db6\") " pod="openshift-dns/dns-default-dbfhg" Dec 03 19:55:51.499043 master-0 kubenswrapper[9368]: I1203 19:55:51.498914 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d196dca7-f940-4aa0-b20a-214d22b62db6-config-volume\") pod \"dns-default-dbfhg\" (UID: \"d196dca7-f940-4aa0-b20a-214d22b62db6\") " pod="openshift-dns/dns-default-dbfhg" 
Dec 03 19:55:51.499043 master-0 kubenswrapper[9368]: I1203 19:55:51.499016 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tphq2\" (UniqueName: \"kubernetes.io/projected/d196dca7-f940-4aa0-b20a-214d22b62db6-kube-api-access-tphq2\") pod \"dns-default-dbfhg\" (UID: \"d196dca7-f940-4aa0-b20a-214d22b62db6\") " pod="openshift-dns/dns-default-dbfhg" Dec 03 19:55:51.499143 master-0 kubenswrapper[9368]: I1203 19:55:51.499058 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-serving-cert\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:51.499196 master-0 kubenswrapper[9368]: E1203 19:55:51.499164 9368 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 03 19:55:51.499235 master-0 kubenswrapper[9368]: E1203 19:55:51.499212 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-serving-cert podName:c593a75e-c2af-4419-94da-e0c9ff14c41f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:52.499197545 +0000 UTC m=+18.160447456 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-serving-cert") pod "apiserver-b46c54696-bgb45" (UID: "c593a75e-c2af-4419-94da-e0c9ff14c41f") : secret "serving-cert" not found Dec 03 19:55:51.599706 master-0 kubenswrapper[9368]: I1203 19:55:51.599484 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d196dca7-f940-4aa0-b20a-214d22b62db6-config-volume\") pod \"dns-default-dbfhg\" (UID: \"d196dca7-f940-4aa0-b20a-214d22b62db6\") " pod="openshift-dns/dns-default-dbfhg" Dec 03 19:55:51.599706 master-0 kubenswrapper[9368]: I1203 19:55:51.599543 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tphq2\" (UniqueName: \"kubernetes.io/projected/d196dca7-f940-4aa0-b20a-214d22b62db6-kube-api-access-tphq2\") pod \"dns-default-dbfhg\" (UID: \"d196dca7-f940-4aa0-b20a-214d22b62db6\") " pod="openshift-dns/dns-default-dbfhg" Dec 03 19:55:51.599706 master-0 kubenswrapper[9368]: I1203 19:55:51.599587 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d196dca7-f940-4aa0-b20a-214d22b62db6-metrics-tls\") pod \"dns-default-dbfhg\" (UID: \"d196dca7-f940-4aa0-b20a-214d22b62db6\") " pod="openshift-dns/dns-default-dbfhg" Dec 03 19:55:51.599706 master-0 kubenswrapper[9368]: E1203 19:55:51.599687 9368 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Dec 03 19:55:51.599873 master-0 kubenswrapper[9368]: E1203 19:55:51.599732 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d196dca7-f940-4aa0-b20a-214d22b62db6-metrics-tls podName:d196dca7-f940-4aa0-b20a-214d22b62db6 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:52.099719379 +0000 UTC m=+17.760969290 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d196dca7-f940-4aa0-b20a-214d22b62db6-metrics-tls") pod "dns-default-dbfhg" (UID: "d196dca7-f940-4aa0-b20a-214d22b62db6") : secret "dns-default-metrics-tls" not found Dec 03 19:55:51.600335 master-0 kubenswrapper[9368]: I1203 19:55:51.600293 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d196dca7-f940-4aa0-b20a-214d22b62db6-config-volume\") pod \"dns-default-dbfhg\" (UID: \"d196dca7-f940-4aa0-b20a-214d22b62db6\") " pod="openshift-dns/dns-default-dbfhg" Dec 03 19:55:51.623906 master-0 kubenswrapper[9368]: I1203 19:55:51.621916 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tphq2\" (UniqueName: \"kubernetes.io/projected/d196dca7-f940-4aa0-b20a-214d22b62db6-kube-api-access-tphq2\") pod \"dns-default-dbfhg\" (UID: \"d196dca7-f940-4aa0-b20a-214d22b62db6\") " pod="openshift-dns/dns-default-dbfhg" Dec 03 19:55:51.765085 master-0 kubenswrapper[9368]: I1203 19:55:51.765015 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" event={"ID":"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf","Type":"ContainerStarted","Data":"72827135272ae5ceb8d19731a24ef08af7394142519a927cde3a2e8735408cbe"} Dec 03 19:55:51.769352 master-0 kubenswrapper[9368]: I1203 19:55:51.768598 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" event={"ID":"128ed384-7ab6-41b6-bf45-c8fda917d52f","Type":"ContainerStarted","Data":"dfd0cb901310111caab1d8aa0658d957f162d4e2531dfcbcc7ceee5bcdb5ac53"} Dec 03 19:55:51.781673 master-0 kubenswrapper[9368]: I1203 19:55:51.781532 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-768d5b868-82c4q" Dec 03 19:55:51.782113 master-0 kubenswrapper[9368]: I1203 19:55:51.781538 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"692f1783-2d80-48a7-af1b-58a1f3f99315","Type":"ContainerStarted","Data":"a09bb661632ae0edd2e2be2bbaaf640ad99daf86961d4f77ca5c520617eeae7b"} Dec 03 19:55:51.782113 master-0 kubenswrapper[9368]: I1203 19:55:51.781749 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"692f1783-2d80-48a7-af1b-58a1f3f99315","Type":"ContainerStarted","Data":"f126dafdd5aa72693eb55c4dcedd323ed5a556c45b737d2405e2a32737d0b414"} Dec 03 19:55:51.845590 master-0 kubenswrapper[9368]: I1203 19:55:51.845508 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hk22l"] Dec 03 19:55:51.845759 master-0 kubenswrapper[9368]: I1203 19:55:51.845639 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-1-master-0" podStartSLOduration=3.845628164 podStartE2EDuration="3.845628164s" podCreationTimestamp="2025-12-03 19:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:55:51.844223719 +0000 UTC m=+17.505473640" watchObservedRunningTime="2025-12-03 19:55:51.845628164 +0000 UTC m=+17.506878075" Dec 03 19:55:51.847633 master-0 kubenswrapper[9368]: I1203 19:55:51.846122 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hk22l" Dec 03 19:55:51.847633 master-0 kubenswrapper[9368]: I1203 19:55:51.846148 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-768d5b868-82c4q" Dec 03 19:55:51.902413 master-0 kubenswrapper[9368]: I1203 19:55:51.902367 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-serving-cert\") pod \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " Dec 03 19:55:51.902413 master-0 kubenswrapper[9368]: I1203 19:55:51.902413 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pczf\" (UniqueName: \"kubernetes.io/projected/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-kube-api-access-2pczf\") pod \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " Dec 03 19:55:51.902550 master-0 kubenswrapper[9368]: I1203 19:55:51.902437 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-config\") pod \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " Dec 03 19:55:51.902550 master-0 kubenswrapper[9368]: I1203 19:55:51.902488 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-proxy-ca-bundles\") pod \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\" (UID: \"5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5\") " Dec 03 19:55:51.902644 master-0 kubenswrapper[9368]: I1203 19:55:51.902620 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvlxr\" (UniqueName: \"kubernetes.io/projected/56e013ee-ea7a-4780-8986-a7fd1b5a3a3f-kube-api-access-vvlxr\") pod \"node-resolver-hk22l\" (UID: \"56e013ee-ea7a-4780-8986-a7fd1b5a3a3f\") " pod="openshift-dns/node-resolver-hk22l" Dec 03 19:55:51.905587 
master-0 kubenswrapper[9368]: I1203 19:55:51.903232 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-config" (OuterVolumeSpecName: "config") pod "5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5" (UID: "5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:55:51.905587 master-0 kubenswrapper[9368]: I1203 19:55:51.904065 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/56e013ee-ea7a-4780-8986-a7fd1b5a3a3f-hosts-file\") pod \"node-resolver-hk22l\" (UID: \"56e013ee-ea7a-4780-8986-a7fd1b5a3a3f\") " pod="openshift-dns/node-resolver-hk22l" Dec 03 19:55:51.905587 master-0 kubenswrapper[9368]: I1203 19:55:51.904158 9368 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-config\") on node \"master-0\" DevicePath \"\"" Dec 03 19:55:51.906446 master-0 kubenswrapper[9368]: I1203 19:55:51.906293 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5" (UID: "5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:55:51.910356 master-0 kubenswrapper[9368]: I1203 19:55:51.907839 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5" (UID: "5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:55:51.919052 master-0 kubenswrapper[9368]: I1203 19:55:51.919002 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-kube-api-access-2pczf" (OuterVolumeSpecName: "kube-api-access-2pczf") pod "5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5" (UID: "5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5"). InnerVolumeSpecName "kube-api-access-2pczf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:55:52.006359 master-0 kubenswrapper[9368]: I1203 19:55:52.005995 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/56e013ee-ea7a-4780-8986-a7fd1b5a3a3f-hosts-file\") pod \"node-resolver-hk22l\" (UID: \"56e013ee-ea7a-4780-8986-a7fd1b5a3a3f\") " pod="openshift-dns/node-resolver-hk22l" Dec 03 19:55:52.006359 master-0 kubenswrapper[9368]: I1203 19:55:52.006046 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvlxr\" (UniqueName: \"kubernetes.io/projected/56e013ee-ea7a-4780-8986-a7fd1b5a3a3f-kube-api-access-vvlxr\") pod \"node-resolver-hk22l\" (UID: \"56e013ee-ea7a-4780-8986-a7fd1b5a3a3f\") " pod="openshift-dns/node-resolver-hk22l" Dec 03 19:55:52.006359 master-0 kubenswrapper[9368]: I1203 19:55:52.006077 9368 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 19:55:52.006359 master-0 kubenswrapper[9368]: I1203 19:55:52.006088 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pczf\" (UniqueName: \"kubernetes.io/projected/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-kube-api-access-2pczf\") on node \"master-0\" DevicePath \"\"" Dec 03 19:55:52.006359 master-0 kubenswrapper[9368]: I1203 19:55:52.006097 9368 reconciler_common.go:293] "Volume 
detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Dec 03 19:55:52.006359 master-0 kubenswrapper[9368]: I1203 19:55:52.006178 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/56e013ee-ea7a-4780-8986-a7fd1b5a3a3f-hosts-file\") pod \"node-resolver-hk22l\" (UID: \"56e013ee-ea7a-4780-8986-a7fd1b5a3a3f\") " pod="openshift-dns/node-resolver-hk22l" Dec 03 19:55:52.030514 master-0 kubenswrapper[9368]: I1203 19:55:52.030482 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvlxr\" (UniqueName: \"kubernetes.io/projected/56e013ee-ea7a-4780-8986-a7fd1b5a3a3f-kube-api-access-vvlxr\") pod \"node-resolver-hk22l\" (UID: \"56e013ee-ea7a-4780-8986-a7fd1b5a3a3f\") " pod="openshift-dns/node-resolver-hk22l" Dec 03 19:55:52.107090 master-0 kubenswrapper[9368]: I1203 19:55:52.106976 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d196dca7-f940-4aa0-b20a-214d22b62db6-metrics-tls\") pod \"dns-default-dbfhg\" (UID: \"d196dca7-f940-4aa0-b20a-214d22b62db6\") " pod="openshift-dns/dns-default-dbfhg" Dec 03 19:55:52.107251 master-0 kubenswrapper[9368]: E1203 19:55:52.107149 9368 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Dec 03 19:55:52.107251 master-0 kubenswrapper[9368]: E1203 19:55:52.107219 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d196dca7-f940-4aa0-b20a-214d22b62db6-metrics-tls podName:d196dca7-f940-4aa0-b20a-214d22b62db6 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:53.107203899 +0000 UTC m=+18.768453810 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d196dca7-f940-4aa0-b20a-214d22b62db6-metrics-tls") pod "dns-default-dbfhg" (UID: "d196dca7-f940-4aa0-b20a-214d22b62db6") : secret "dns-default-metrics-tls" not found Dec 03 19:55:52.168132 master-0 kubenswrapper[9368]: I1203 19:55:52.168072 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6"] Dec 03 19:55:52.175637 master-0 kubenswrapper[9368]: I1203 19:55:52.173972 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hk22l" Dec 03 19:55:52.176360 master-0 kubenswrapper[9368]: W1203 19:55:52.176315 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb673cb04_f6f0_4113_bdcd_d6685b942c9f.slice/crio-c0a70d9d0d86d6f719dfcdce57dead2d1b8eec5a2b0f03bea14ce004f4ee91ea WatchSource:0}: Error finding container c0a70d9d0d86d6f719dfcdce57dead2d1b8eec5a2b0f03bea14ce004f4ee91ea: Status 404 returned error can't find the container with id c0a70d9d0d86d6f719dfcdce57dead2d1b8eec5a2b0f03bea14ce004f4ee91ea Dec 03 19:55:52.177416 master-0 kubenswrapper[9368]: I1203 19:55:52.177379 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"] Dec 03 19:55:52.181963 master-0 kubenswrapper[9368]: I1203 19:55:52.181933 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n"] Dec 03 19:55:52.186418 master-0 kubenswrapper[9368]: I1203 19:55:52.186381 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q"] Dec 03 19:55:52.283068 master-0 kubenswrapper[9368]: I1203 19:55:52.283014 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"] Dec 03 19:55:52.297128 master-0 kubenswrapper[9368]: I1203 19:55:52.297043 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j"] Dec 03 19:55:52.298600 master-0 kubenswrapper[9368]: I1203 19:55:52.298496 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hs6gf"] Dec 03 19:55:52.298600 master-0 kubenswrapper[9368]: W1203 19:55:52.298567 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda19b8f9e_6299_43bf_9aa5_22071b855773.slice/crio-ae7d38a01611431f5b1b916ad750e69ffaaacefc6c9b10d1dad35bf2f9161d22 WatchSource:0}: Error finding container ae7d38a01611431f5b1b916ad750e69ffaaacefc6c9b10d1dad35bf2f9161d22: Status 404 returned error can't find the container with id ae7d38a01611431f5b1b916ad750e69ffaaacefc6c9b10d1dad35bf2f9161d22 Dec 03 19:55:52.313736 master-0 kubenswrapper[9368]: W1203 19:55:52.313695 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46b5d4d0_b841_4e87_84b4_85911ff04325.slice/crio-d4b28ab1c5b84f1f69a50cfae68a166f61b8e5091b37338c90666da83b930b13 WatchSource:0}: Error finding container d4b28ab1c5b84f1f69a50cfae68a166f61b8e5091b37338c90666da83b930b13: Status 404 returned error can't find the container with id d4b28ab1c5b84f1f69a50cfae68a166f61b8e5091b37338c90666da83b930b13 Dec 03 19:55:52.511237 master-0 kubenswrapper[9368]: I1203 19:55:52.511188 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-serving-cert\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:52.511415 master-0 
kubenswrapper[9368]: E1203 19:55:52.511385 9368 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 03 19:55:52.511500 master-0 kubenswrapper[9368]: E1203 19:55:52.511479 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-serving-cert podName:c593a75e-c2af-4419-94da-e0c9ff14c41f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:54.511457017 +0000 UTC m=+20.172706998 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-serving-cert") pod "apiserver-b46c54696-bgb45" (UID: "c593a75e-c2af-4419-94da-e0c9ff14c41f") : secret "serving-cert" not found Dec 03 19:55:52.563798 master-0 kubenswrapper[9368]: I1203 19:55:52.561751 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14f3df7c-082e-4555-b545-6b4287f4c1a1" path="/var/lib/kubelet/pods/14f3df7c-082e-4555-b545-6b4287f4c1a1/volumes" Dec 03 19:55:52.790887 master-0 kubenswrapper[9368]: I1203 19:55:52.788180 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" event={"ID":"b673cb04-f6f0-4113-bdcd-d6685b942c9f","Type":"ContainerStarted","Data":"c0a70d9d0d86d6f719dfcdce57dead2d1b8eec5a2b0f03bea14ce004f4ee91ea"} Dec 03 19:55:52.790887 master-0 kubenswrapper[9368]: I1203 19:55:52.789830 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" event={"ID":"a19b8f9e-6299-43bf-9aa5-22071b855773","Type":"ContainerStarted","Data":"ae7d38a01611431f5b1b916ad750e69ffaaacefc6c9b10d1dad35bf2f9161d22"} Dec 03 19:55:52.798605 master-0 kubenswrapper[9368]: I1203 19:55:52.791822 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" 
event={"ID":"ba68608f-6b36-455e-b80b-d19237df9312","Type":"ContainerStarted","Data":"75ad2809d96a1369619e26966fceb45e6c13fc754c6dc35b21749d37ba20ab2a"} Dec 03 19:55:52.798605 master-0 kubenswrapper[9368]: I1203 19:55:52.793998 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" event={"ID":"b4316c8d-a1d3-4e51-83cc-d0eecb809924","Type":"ContainerStarted","Data":"42e1b375dcaebdf8d6351192223452d8b91294cb866b8e2a93c4bc9df5e70f90"} Dec 03 19:55:52.798605 master-0 kubenswrapper[9368]: I1203 19:55:52.795658 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hs6gf" event={"ID":"46b5d4d0-b841-4e87-84b4-85911ff04325","Type":"ContainerStarted","Data":"d4b28ab1c5b84f1f69a50cfae68a166f61b8e5091b37338c90666da83b930b13"} Dec 03 19:55:52.798605 master-0 kubenswrapper[9368]: I1203 19:55:52.798383 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" event={"ID":"d5f33153-bff1-403f-ae17-b7e90500365d","Type":"ContainerStarted","Data":"9a60557e4a853b254e1a52367430f6552fb59c31039de6af8378df26f94038fb"} Dec 03 19:55:52.802534 master-0 kubenswrapper[9368]: I1203 19:55:52.802489 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" event={"ID":"0d4e4f88-7106-4a46-8b63-053345922fb0","Type":"ContainerStarted","Data":"82c8f09a4e348df690ae32454c1c4a9c81ec7e77a1b025b622f00d261c340273"} Dec 03 19:55:52.802655 master-0 kubenswrapper[9368]: I1203 19:55:52.802562 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" event={"ID":"0d4e4f88-7106-4a46-8b63-053345922fb0","Type":"ContainerStarted","Data":"e2387dbfcc1d429cb65e949d260da12685f9167ab5d7e2e2846349bd7d4f915e"} Dec 03 19:55:52.805640 master-0 kubenswrapper[9368]: I1203 19:55:52.805549 
9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hk22l" event={"ID":"56e013ee-ea7a-4780-8986-a7fd1b5a3a3f","Type":"ContainerStarted","Data":"744ae608acaa8af85d60fc01490827db0f98f2477c87f8cd6e589181aaae0848"} Dec 03 19:55:52.805640 master-0 kubenswrapper[9368]: I1203 19:55:52.805573 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hk22l" event={"ID":"56e013ee-ea7a-4780-8986-a7fd1b5a3a3f","Type":"ContainerStarted","Data":"6a5af31c4c1e2f84958d04c9531001f07d3ef520fdf16d375a2d25f61196cfa7"} Dec 03 19:55:52.806106 master-0 kubenswrapper[9368]: I1203 19:55:52.806076 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-768d5b868-82c4q" Dec 03 19:55:52.819957 master-0 kubenswrapper[9368]: I1203 19:55:52.819864 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hk22l" podStartSLOduration=1.8198470900000001 podStartE2EDuration="1.81984709s" podCreationTimestamp="2025-12-03 19:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:55:52.819171433 +0000 UTC m=+18.480421374" watchObservedRunningTime="2025-12-03 19:55:52.81984709 +0000 UTC m=+18.481097011" Dec 03 19:55:52.840827 master-0 kubenswrapper[9368]: I1203 19:55:52.840723 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-768d5b868-82c4q"] Dec 03 19:55:52.846166 master-0 kubenswrapper[9368]: I1203 19:55:52.846108 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6877695c95-4kmf4"] Dec 03 19:55:52.847865 master-0 kubenswrapper[9368]: I1203 19:55:52.847570 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-768d5b868-82c4q"] Dec 03 19:55:52.847865 
master-0 kubenswrapper[9368]: I1203 19:55:52.847721 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:55:52.853457 master-0 kubenswrapper[9368]: I1203 19:55:52.850572 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 19:55:52.853457 master-0 kubenswrapper[9368]: I1203 19:55:52.851068 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 19:55:52.853457 master-0 kubenswrapper[9368]: I1203 19:55:52.851388 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 19:55:52.853457 master-0 kubenswrapper[9368]: I1203 19:55:52.851749 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 19:55:52.853457 master-0 kubenswrapper[9368]: I1203 19:55:52.852028 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 19:55:52.853916 master-0 kubenswrapper[9368]: I1203 19:55:52.853875 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6877695c95-4kmf4"] Dec 03 19:55:52.858035 master-0 kubenswrapper[9368]: I1203 19:55:52.856214 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 19:55:52.915561 master-0 kubenswrapper[9368]: I1203 19:55:52.915515 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2233205-0100-449b-81ee-7e13551adf6f-serving-cert\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " 
pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:55:52.915802 master-0 kubenswrapper[9368]: I1203 19:55:52.915610 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jczb\" (UniqueName: \"kubernetes.io/projected/a2233205-0100-449b-81ee-7e13551adf6f-kube-api-access-4jczb\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:55:52.922184 master-0 kubenswrapper[9368]: I1203 19:55:52.922129 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-config\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:55:52.922292 master-0 kubenswrapper[9368]: I1203 19:55:52.922248 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:55:52.922335 master-0 kubenswrapper[9368]: I1203 19:55:52.922286 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-proxy-ca-bundles\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:55:52.922384 master-0 kubenswrapper[9368]: I1203 19:55:52.922369 9368 reconciler_common.go:293] "Volume detached 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 19:55:53.024802 master-0 kubenswrapper[9368]: I1203 19:55:53.023791 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2233205-0100-449b-81ee-7e13551adf6f-serving-cert\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:55:53.024802 master-0 kubenswrapper[9368]: I1203 19:55:53.023883 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jczb\" (UniqueName: \"kubernetes.io/projected/a2233205-0100-449b-81ee-7e13551adf6f-kube-api-access-4jczb\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:55:53.024802 master-0 kubenswrapper[9368]: I1203 19:55:53.023941 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-config\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:55:53.024802 master-0 kubenswrapper[9368]: I1203 19:55:53.023973 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:55:53.024802 master-0 kubenswrapper[9368]: I1203 19:55:53.023994 9368 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-proxy-ca-bundles\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:55:53.036802 master-0 kubenswrapper[9368]: I1203 19:55:53.025283 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-proxy-ca-bundles\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:55:53.036802 master-0 kubenswrapper[9368]: E1203 19:55:53.025857 9368 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 03 19:55:53.036802 master-0 kubenswrapper[9368]: E1203 19:55:53.026058 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca podName:a2233205-0100-449b-81ee-7e13551adf6f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:53.526007156 +0000 UTC m=+19.187257067 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca") pod "controller-manager-6877695c95-4kmf4" (UID: "a2233205-0100-449b-81ee-7e13551adf6f") : configmap "client-ca" not found Dec 03 19:55:53.036802 master-0 kubenswrapper[9368]: I1203 19:55:53.028434 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-config\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:55:53.036802 master-0 kubenswrapper[9368]: I1203 19:55:53.029043 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2233205-0100-449b-81ee-7e13551adf6f-serving-cert\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:55:53.042315 master-0 kubenswrapper[9368]: I1203 19:55:53.042088 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jczb\" (UniqueName: \"kubernetes.io/projected/a2233205-0100-449b-81ee-7e13551adf6f-kube-api-access-4jczb\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:55:53.126045 master-0 kubenswrapper[9368]: I1203 19:55:53.126000 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d196dca7-f940-4aa0-b20a-214d22b62db6-metrics-tls\") pod \"dns-default-dbfhg\" (UID: \"d196dca7-f940-4aa0-b20a-214d22b62db6\") " pod="openshift-dns/dns-default-dbfhg" Dec 03 19:55:53.126223 master-0 kubenswrapper[9368]: E1203 19:55:53.126206 9368 
secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Dec 03 19:55:53.126332 master-0 kubenswrapper[9368]: E1203 19:55:53.126312 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d196dca7-f940-4aa0-b20a-214d22b62db6-metrics-tls podName:d196dca7-f940-4aa0-b20a-214d22b62db6 nodeName:}" failed. No retries permitted until 2025-12-03 19:55:55.126296894 +0000 UTC m=+20.787546805 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d196dca7-f940-4aa0-b20a-214d22b62db6-metrics-tls") pod "dns-default-dbfhg" (UID: "d196dca7-f940-4aa0-b20a-214d22b62db6") : secret "dns-default-metrics-tls" not found Dec 03 19:55:53.539624 master-0 kubenswrapper[9368]: I1203 19:55:53.539576 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:55:53.539828 master-0 kubenswrapper[9368]: E1203 19:55:53.539807 9368 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 03 19:55:53.539867 master-0 kubenswrapper[9368]: E1203 19:55:53.539860 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca podName:a2233205-0100-449b-81ee-7e13551adf6f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:54.539848108 +0000 UTC m=+20.201098019 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca") pod "controller-manager-6877695c95-4kmf4" (UID: "a2233205-0100-449b-81ee-7e13551adf6f") : configmap "client-ca" not found Dec 03 19:55:54.550468 master-0 kubenswrapper[9368]: I1203 19:55:54.550422 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5" path="/var/lib/kubelet/pods/5bdd41b6-0c74-4d23-9c81-0f7f70d9c6f5/volumes" Dec 03 19:55:54.553073 master-0 kubenswrapper[9368]: I1203 19:55:54.553013 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-serving-cert\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:54.553204 master-0 kubenswrapper[9368]: I1203 19:55:54.553173 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:55:54.553306 master-0 kubenswrapper[9368]: E1203 19:55:54.553282 9368 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 03 19:55:54.553364 master-0 kubenswrapper[9368]: E1203 19:55:54.553340 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca podName:a2233205-0100-449b-81ee-7e13551adf6f nodeName:}" failed. No retries permitted until 2025-12-03 19:55:56.55332679 +0000 UTC m=+22.214576701 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca") pod "controller-manager-6877695c95-4kmf4" (UID: "a2233205-0100-449b-81ee-7e13551adf6f") : configmap "client-ca" not found Dec 03 19:55:54.557287 master-0 kubenswrapper[9368]: I1203 19:55:54.557223 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-serving-cert\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:54.735545 master-0 kubenswrapper[9368]: I1203 19:55:54.735467 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:55:55.158720 master-0 kubenswrapper[9368]: I1203 19:55:55.158667 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d196dca7-f940-4aa0-b20a-214d22b62db6-metrics-tls\") pod \"dns-default-dbfhg\" (UID: \"d196dca7-f940-4aa0-b20a-214d22b62db6\") " pod="openshift-dns/dns-default-dbfhg" Dec 03 19:55:55.166659 master-0 kubenswrapper[9368]: I1203 19:55:55.166613 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d196dca7-f940-4aa0-b20a-214d22b62db6-metrics-tls\") pod \"dns-default-dbfhg\" (UID: \"d196dca7-f940-4aa0-b20a-214d22b62db6\") " pod="openshift-dns/dns-default-dbfhg" Dec 03 19:55:55.446229 master-0 kubenswrapper[9368]: I1203 19:55:55.446099 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-dbfhg" Dec 03 19:55:56.528182 master-0 kubenswrapper[9368]: I1203 19:55:56.526636 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dbfhg"] Dec 03 19:55:56.538242 master-0 kubenswrapper[9368]: W1203 19:55:56.538207 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd196dca7_f940_4aa0_b20a_214d22b62db6.slice/crio-0c718666bc5fa03621c39d9ebf94cf18c64cdbdf19cd3de5b727dc2e38eb8ea5 WatchSource:0}: Error finding container 0c718666bc5fa03621c39d9ebf94cf18c64cdbdf19cd3de5b727dc2e38eb8ea5: Status 404 returned error can't find the container with id 0c718666bc5fa03621c39d9ebf94cf18c64cdbdf19cd3de5b727dc2e38eb8ea5 Dec 03 19:55:56.582445 master-0 kubenswrapper[9368]: I1203 19:55:56.582370 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:55:56.582839 master-0 kubenswrapper[9368]: E1203 19:55:56.582505 9368 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 03 19:55:56.582839 master-0 kubenswrapper[9368]: E1203 19:55:56.582571 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca podName:a2233205-0100-449b-81ee-7e13551adf6f nodeName:}" failed. No retries permitted until 2025-12-03 19:56:00.582543563 +0000 UTC m=+26.243793484 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca") pod "controller-manager-6877695c95-4kmf4" (UID: "a2233205-0100-449b-81ee-7e13551adf6f") : configmap "client-ca" not found Dec 03 19:55:56.586641 master-0 kubenswrapper[9368]: I1203 19:55:56.586599 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-b46c54696-bgb45"] Dec 03 19:55:56.608721 master-0 kubenswrapper[9368]: W1203 19:55:56.605282 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc593a75e_c2af_4419_94da_e0c9ff14c41f.slice/crio-06b5799dce9659bf0df409ce3b2524ff568aaba7fc6e7ca8b83098be9071ffc9 WatchSource:0}: Error finding container 06b5799dce9659bf0df409ce3b2524ff568aaba7fc6e7ca8b83098be9071ffc9: Status 404 returned error can't find the container with id 06b5799dce9659bf0df409ce3b2524ff568aaba7fc6e7ca8b83098be9071ffc9 Dec 03 19:55:56.825902 master-0 kubenswrapper[9368]: I1203 19:55:56.825806 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hs6gf" event={"ID":"46b5d4d0-b841-4e87-84b4-85911ff04325","Type":"ContainerStarted","Data":"a3f214b76069ead29156acedaf0e2b0c7ae7bb30a892a0bcd0d028e94a27937d"} Dec 03 19:55:56.825902 master-0 kubenswrapper[9368]: I1203 19:55:56.825879 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hs6gf" event={"ID":"46b5d4d0-b841-4e87-84b4-85911ff04325","Type":"ContainerStarted","Data":"423ec2b156c6d94d5c6e5f4007ef950fc3bc548c2375a5329992d8c48a37db8f"} Dec 03 19:55:56.831493 master-0 kubenswrapper[9368]: I1203 19:55:56.831440 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-b46c54696-bgb45" event={"ID":"c593a75e-c2af-4419-94da-e0c9ff14c41f","Type":"ContainerStarted","Data":"06b5799dce9659bf0df409ce3b2524ff568aaba7fc6e7ca8b83098be9071ffc9"} 
Dec 03 19:55:56.832994 master-0 kubenswrapper[9368]: I1203 19:55:56.832942 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" event={"ID":"ba68608f-6b36-455e-b80b-d19237df9312","Type":"ContainerStarted","Data":"6df842bd6d68b04f8489a3a8320f20734ae4b6f9e1bbd34295dabde01ec89569"} Dec 03 19:55:56.835111 master-0 kubenswrapper[9368]: I1203 19:55:56.834796 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" event={"ID":"b673cb04-f6f0-4113-bdcd-d6685b942c9f","Type":"ContainerStarted","Data":"efb0326864f224addc60569e753ed4f7ba080c2fc63c85d174a9de0f4aa3dad6"} Dec 03 19:55:56.835111 master-0 kubenswrapper[9368]: I1203 19:55:56.834975 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:56.839489 master-0 kubenswrapper[9368]: I1203 19:55:56.839413 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" event={"ID":"b4316c8d-a1d3-4e51-83cc-d0eecb809924","Type":"ContainerStarted","Data":"77dceba290fd067cd611c6d2a5e4c623247f11076c1771bf8dc8e4af20aaef57"} Dec 03 19:55:56.839642 master-0 kubenswrapper[9368]: I1203 19:55:56.839556 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" event={"ID":"b4316c8d-a1d3-4e51-83cc-d0eecb809924","Type":"ContainerStarted","Data":"ccaa5bcc074786e1602c431f92bcbfc1662e1c5b23f45ded5617110476671e11"} Dec 03 19:55:56.842076 master-0 kubenswrapper[9368]: I1203 19:55:56.842018 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 19:55:56.842821 master-0 kubenswrapper[9368]: I1203 19:55:56.842256 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dbfhg" 
event={"ID":"d196dca7-f940-4aa0-b20a-214d22b62db6","Type":"ContainerStarted","Data":"0c718666bc5fa03621c39d9ebf94cf18c64cdbdf19cd3de5b727dc2e38eb8ea5"} Dec 03 19:55:57.478645 master-0 kubenswrapper[9368]: I1203 19:55:57.478575 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf"] Dec 03 19:55:57.481677 master-0 kubenswrapper[9368]: I1203 19:55:57.481650 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 19:55:57.484164 master-0 kubenswrapper[9368]: I1203 19:55:57.484128 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Dec 03 19:55:57.493488 master-0 kubenswrapper[9368]: I1203 19:55:57.492641 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Dec 03 19:55:57.500375 master-0 kubenswrapper[9368]: I1203 19:55:57.500308 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf"] Dec 03 19:55:57.513837 master-0 kubenswrapper[9368]: I1203 19:55:57.513791 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Dec 03 19:55:57.558283 master-0 kubenswrapper[9368]: I1203 19:55:57.558204 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j"] Dec 03 19:55:57.558952 master-0 kubenswrapper[9368]: I1203 19:55:57.558936 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:57.560805 master-0 kubenswrapper[9368]: I1203 19:55:57.560763 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Dec 03 19:55:57.561592 master-0 kubenswrapper[9368]: I1203 19:55:57.561017 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Dec 03 19:55:57.561592 master-0 kubenswrapper[9368]: I1203 19:55:57.561306 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Dec 03 19:55:57.566684 master-0 kubenswrapper[9368]: I1203 19:55:57.566649 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Dec 03 19:55:57.574265 master-0 kubenswrapper[9368]: I1203 19:55:57.574220 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j"] Dec 03 19:55:57.613911 master-0 kubenswrapper[9368]: I1203 19:55:57.612952 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pf5q\" (UniqueName: \"kubernetes.io/projected/73b7027e-44f5-4c7b-9226-585a90530535-kube-api-access-7pf5q\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 19:55:57.613911 master-0 kubenswrapper[9368]: I1203 19:55:57.613103 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/73b7027e-44f5-4c7b-9226-585a90530535-ca-certs\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 19:55:57.613911 master-0 kubenswrapper[9368]: I1203 19:55:57.613194 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/73b7027e-44f5-4c7b-9226-585a90530535-etc-docker\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 19:55:57.613911 master-0 kubenswrapper[9368]: I1203 19:55:57.613239 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/73b7027e-44f5-4c7b-9226-585a90530535-cache\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 19:55:57.613911 master-0 kubenswrapper[9368]: I1203 19:55:57.613269 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/73b7027e-44f5-4c7b-9226-585a90530535-etc-containers\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 19:55:57.714337 master-0 kubenswrapper[9368]: I1203 19:55:57.714269 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1f82c7a1-ec21-497d-86f2-562cafa7ace7-etc-docker\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" 
Dec 03 19:55:57.714337 master-0 kubenswrapper[9368]: I1203 19:55:57.714322 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1f82c7a1-ec21-497d-86f2-562cafa7ace7-cache\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:57.714337 master-0 kubenswrapper[9368]: I1203 19:55:57.714353 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1f82c7a1-ec21-497d-86f2-562cafa7ace7-ca-certs\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:57.714829 master-0 kubenswrapper[9368]: I1203 19:55:57.714384 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/73b7027e-44f5-4c7b-9226-585a90530535-cache\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 19:55:57.714829 master-0 kubenswrapper[9368]: I1203 19:55:57.714408 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/73b7027e-44f5-4c7b-9226-585a90530535-etc-containers\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 19:55:57.714829 master-0 kubenswrapper[9368]: I1203 19:55:57.714443 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pf5q\" 
(UniqueName: \"kubernetes.io/projected/73b7027e-44f5-4c7b-9226-585a90530535-kube-api-access-7pf5q\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 19:55:57.714829 master-0 kubenswrapper[9368]: I1203 19:55:57.714466 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1f82c7a1-ec21-497d-86f2-562cafa7ace7-catalogserver-certs\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:57.714829 master-0 kubenswrapper[9368]: I1203 19:55:57.714502 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/73b7027e-44f5-4c7b-9226-585a90530535-etc-docker\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 19:55:57.714829 master-0 kubenswrapper[9368]: I1203 19:55:57.714533 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95zsj\" (UniqueName: \"kubernetes.io/projected/1f82c7a1-ec21-497d-86f2-562cafa7ace7-kube-api-access-95zsj\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:57.714829 master-0 kubenswrapper[9368]: I1203 19:55:57.714557 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1f82c7a1-ec21-497d-86f2-562cafa7ace7-etc-containers\") 
pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:57.714829 master-0 kubenswrapper[9368]: I1203 19:55:57.714608 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/73b7027e-44f5-4c7b-9226-585a90530535-ca-certs\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 19:55:57.715306 master-0 kubenswrapper[9368]: I1203 19:55:57.715274 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/73b7027e-44f5-4c7b-9226-585a90530535-etc-containers\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 19:55:57.715451 master-0 kubenswrapper[9368]: I1203 19:55:57.715424 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/73b7027e-44f5-4c7b-9226-585a90530535-etc-docker\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 19:55:57.715821 master-0 kubenswrapper[9368]: I1203 19:55:57.715753 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/73b7027e-44f5-4c7b-9226-585a90530535-cache\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 19:55:57.719666 master-0 kubenswrapper[9368]: I1203 19:55:57.719630 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/73b7027e-44f5-4c7b-9226-585a90530535-ca-certs\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 19:55:57.734219 master-0 kubenswrapper[9368]: I1203 19:55:57.734084 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pf5q\" (UniqueName: \"kubernetes.io/projected/73b7027e-44f5-4c7b-9226-585a90530535-kube-api-access-7pf5q\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 19:55:57.813341 master-0 kubenswrapper[9368]: I1203 19:55:57.813296 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 19:55:57.818037 master-0 kubenswrapper[9368]: I1203 19:55:57.815920 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1f82c7a1-ec21-497d-86f2-562cafa7ace7-catalogserver-certs\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:57.818037 master-0 kubenswrapper[9368]: I1203 19:55:57.816168 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95zsj\" (UniqueName: \"kubernetes.io/projected/1f82c7a1-ec21-497d-86f2-562cafa7ace7-kube-api-access-95zsj\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:57.818037 master-0 kubenswrapper[9368]: I1203 19:55:57.816198 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1f82c7a1-ec21-497d-86f2-562cafa7ace7-etc-containers\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:57.818037 master-0 kubenswrapper[9368]: I1203 19:55:57.816301 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1f82c7a1-ec21-497d-86f2-562cafa7ace7-etc-docker\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:57.818037 master-0 kubenswrapper[9368]: I1203 19:55:57.816361 9368 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1f82c7a1-ec21-497d-86f2-562cafa7ace7-cache\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:57.818037 master-0 kubenswrapper[9368]: I1203 19:55:57.816416 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1f82c7a1-ec21-497d-86f2-562cafa7ace7-ca-certs\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:57.818037 master-0 kubenswrapper[9368]: I1203 19:55:57.817280 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1f82c7a1-ec21-497d-86f2-562cafa7ace7-etc-containers\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:57.818037 master-0 kubenswrapper[9368]: I1203 19:55:57.817637 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1f82c7a1-ec21-497d-86f2-562cafa7ace7-etc-docker\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:57.818037 master-0 kubenswrapper[9368]: I1203 19:55:57.817994 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1f82c7a1-ec21-497d-86f2-562cafa7ace7-cache\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " 
pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:57.819903 master-0 kubenswrapper[9368]: I1203 19:55:57.819877 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1f82c7a1-ec21-497d-86f2-562cafa7ace7-catalogserver-certs\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:57.822362 master-0 kubenswrapper[9368]: I1203 19:55:57.822330 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1f82c7a1-ec21-497d-86f2-562cafa7ace7-ca-certs\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:57.839092 master-0 kubenswrapper[9368]: I1203 19:55:57.839049 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95zsj\" (UniqueName: \"kubernetes.io/projected/1f82c7a1-ec21-497d-86f2-562cafa7ace7-kube-api-access-95zsj\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:57.878895 master-0 kubenswrapper[9368]: I1203 19:55:57.878807 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 19:55:58.408657 master-0 kubenswrapper[9368]: I1203 19:55:58.408591 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 03 19:55:58.408902 master-0 kubenswrapper[9368]: I1203 19:55:58.408844 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-1-master-0" podUID="692f1783-2d80-48a7-af1b-58a1f3f99315" containerName="installer" containerID="cri-o://a09bb661632ae0edd2e2be2bbaaf640ad99daf86961d4f77ca5c520617eeae7b" gracePeriod=30 Dec 03 19:55:58.703344 master-0 kubenswrapper[9368]: I1203 19:55:58.703160 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb"] Dec 03 19:55:58.704149 master-0 kubenswrapper[9368]: I1203 19:55:58.703905 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.708446 master-0 kubenswrapper[9368]: I1203 19:55:58.707665 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 19:55:58.708446 master-0 kubenswrapper[9368]: I1203 19:55:58.707839 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 19:55:58.708446 master-0 kubenswrapper[9368]: I1203 19:55:58.707901 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 19:55:58.708446 master-0 kubenswrapper[9368]: I1203 19:55:58.708206 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 19:55:58.708446 master-0 kubenswrapper[9368]: I1203 19:55:58.708295 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" 
Dec 03 19:55:58.708850 master-0 kubenswrapper[9368]: I1203 19:55:58.708759 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 19:55:58.710888 master-0 kubenswrapper[9368]: I1203 19:55:58.709427 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 19:55:58.710888 master-0 kubenswrapper[9368]: I1203 19:55:58.709560 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 19:55:58.715748 master-0 kubenswrapper[9368]: I1203 19:55:58.715625 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb"] Dec 03 19:55:58.835599 master-0 kubenswrapper[9368]: I1203 19:55:58.835518 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f96c70ce-314a-4919-91e9-cc776a620846-audit-policies\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.835599 master-0 kubenswrapper[9368]: I1203 19:55:58.835583 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f96c70ce-314a-4919-91e9-cc776a620846-etcd-client\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.835599 master-0 kubenswrapper[9368]: I1203 19:55:58.835609 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f96c70ce-314a-4919-91e9-cc776a620846-trusted-ca-bundle\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " 
pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.835942 master-0 kubenswrapper[9368]: I1203 19:55:58.835673 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f96c70ce-314a-4919-91e9-cc776a620846-encryption-config\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.835942 master-0 kubenswrapper[9368]: I1203 19:55:58.835695 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f96c70ce-314a-4919-91e9-cc776a620846-audit-dir\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.835942 master-0 kubenswrapper[9368]: I1203 19:55:58.835719 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkhn4\" (UniqueName: \"kubernetes.io/projected/f96c70ce-314a-4919-91e9-cc776a620846-kube-api-access-lkhn4\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.835942 master-0 kubenswrapper[9368]: I1203 19:55:58.835751 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f96c70ce-314a-4919-91e9-cc776a620846-etcd-serving-ca\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.835942 master-0 kubenswrapper[9368]: I1203 19:55:58.835771 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/f96c70ce-314a-4919-91e9-cc776a620846-serving-cert\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.937359 master-0 kubenswrapper[9368]: I1203 19:55:58.937272 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f96c70ce-314a-4919-91e9-cc776a620846-audit-dir\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.937359 master-0 kubenswrapper[9368]: I1203 19:55:58.937324 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkhn4\" (UniqueName: \"kubernetes.io/projected/f96c70ce-314a-4919-91e9-cc776a620846-kube-api-access-lkhn4\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.937761 master-0 kubenswrapper[9368]: I1203 19:55:58.937383 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f96c70ce-314a-4919-91e9-cc776a620846-audit-dir\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.937761 master-0 kubenswrapper[9368]: I1203 19:55:58.937406 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f96c70ce-314a-4919-91e9-cc776a620846-etcd-serving-ca\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.937761 master-0 kubenswrapper[9368]: I1203 19:55:58.937452 9368 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f96c70ce-314a-4919-91e9-cc776a620846-serving-cert\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.937761 master-0 kubenswrapper[9368]: I1203 19:55:58.937488 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f96c70ce-314a-4919-91e9-cc776a620846-audit-policies\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.937761 master-0 kubenswrapper[9368]: I1203 19:55:58.937547 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f96c70ce-314a-4919-91e9-cc776a620846-etcd-client\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.937761 master-0 kubenswrapper[9368]: I1203 19:55:58.937568 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f96c70ce-314a-4919-91e9-cc776a620846-trusted-ca-bundle\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.937761 master-0 kubenswrapper[9368]: I1203 19:55:58.937610 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f96c70ce-314a-4919-91e9-cc776a620846-encryption-config\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.940151 master-0 
kubenswrapper[9368]: I1203 19:55:58.940098 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f96c70ce-314a-4919-91e9-cc776a620846-audit-policies\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.943577 master-0 kubenswrapper[9368]: I1203 19:55:58.940260 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f96c70ce-314a-4919-91e9-cc776a620846-trusted-ca-bundle\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.943577 master-0 kubenswrapper[9368]: I1203 19:55:58.941359 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f96c70ce-314a-4919-91e9-cc776a620846-etcd-serving-ca\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.943921 master-0 kubenswrapper[9368]: I1203 19:55:58.943666 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f96c70ce-314a-4919-91e9-cc776a620846-encryption-config\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.944052 master-0 kubenswrapper[9368]: I1203 19:55:58.943975 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f96c70ce-314a-4919-91e9-cc776a620846-serving-cert\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" 
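The mount entries above follow a fixed reconciler sequence per volume: `VerifyControllerAttachedVolume started` → `MountVolume started` → `MountVolume.SetUp succeeded`. A sketch for tracing one volume's lifecycle through a saved excerpt (the function and regex are illustrative assumptions; the optional `\"` handling covers the journal's escaped quotes inside structured entries):

```python
import re

# Hypothetical helper: list the reconciler stages logged for one volume.
# Accepts both plain quotes and the journal's escaped \" around the name.
STAGE = re.compile(
    r'(VerifyControllerAttachedVolume started|MountVolume started|'
    r'MountVolume\.SetUp succeeded) for volume \\?"(?P<vol>[^"\\]+)\\?"'
)

def stages(journal_text, volume):
    """Return the stage names seen for `volume`, in log order."""
    return [m.group(1) for m in STAGE.finditer(journal_text)
            if m.group("vol") == volume]

# Condensed sample mirroring the serving-cert entries above.
lines = [
    'operationExecutor.VerifyControllerAttachedVolume started for volume "serving-cert"',
    'operationExecutor.MountVolume started for volume "serving-cert"',
    'MountVolume.SetUp succeeded for volume "serving-cert"',
]
print(stages("\n".join(lines), "serving-cert"))
```

A volume whose trace stops after `MountVolume started` (like `client-ca` here) is the one worth investigating.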
Dec 03 19:55:58.945031 master-0 kubenswrapper[9368]: I1203 19:55:58.944982 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f96c70ce-314a-4919-91e9-cc776a620846-etcd-client\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:58.954986 master-0 kubenswrapper[9368]: I1203 19:55:58.954852 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkhn4\" (UniqueName: \"kubernetes.io/projected/f96c70ce-314a-4919-91e9-cc776a620846-kube-api-access-lkhn4\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:55:59.022742 master-0 kubenswrapper[9368]: I1203 19:55:59.022653 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 19:56:00.544494 master-0 kubenswrapper[9368]: I1203 19:56:00.544436 9368 scope.go:117] "RemoveContainer" containerID="86fb2ded70064a9e30cf3bd596a82e68f52a88cf948050917e5c6fb69423eb23" Dec 03 19:56:00.659402 master-0 kubenswrapper[9368]: I1203 19:56:00.659262 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:56:00.659700 master-0 kubenswrapper[9368]: E1203 19:56:00.659481 9368 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 03 19:56:00.659700 master-0 kubenswrapper[9368]: E1203 19:56:00.659622 9368 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca podName:a2233205-0100-449b-81ee-7e13551adf6f nodeName:}" failed. No retries permitted until 2025-12-03 19:56:08.659586559 +0000 UTC m=+34.320836500 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca") pod "controller-manager-6877695c95-4kmf4" (UID: "a2233205-0100-449b-81ee-7e13551adf6f") : configmap "client-ca" not found Dec 03 19:56:02.632032 master-0 kubenswrapper[9368]: I1203 19:56:02.630302 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 03 19:56:02.632032 master-0 kubenswrapper[9368]: I1203 19:56:02.631137 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 19:56:02.672337 master-0 kubenswrapper[9368]: I1203 19:56:02.670215 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 03 19:56:02.688601 master-0 kubenswrapper[9368]: I1203 19:56:02.688549 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57c8b80e-76a2-4021-a94d-329689a6ae77-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"57c8b80e-76a2-4021-a94d-329689a6ae77\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 19:56:02.688837 master-0 kubenswrapper[9368]: I1203 19:56:02.688631 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/57c8b80e-76a2-4021-a94d-329689a6ae77-var-lock\") pod \"installer-2-master-0\" (UID: \"57c8b80e-76a2-4021-a94d-329689a6ae77\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 19:56:02.688837 master-0 kubenswrapper[9368]: I1203 19:56:02.688716 9368 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57c8b80e-76a2-4021-a94d-329689a6ae77-kube-api-access\") pod \"installer-2-master-0\" (UID: \"57c8b80e-76a2-4021-a94d-329689a6ae77\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 19:56:02.790503 master-0 kubenswrapper[9368]: I1203 19:56:02.790426 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57c8b80e-76a2-4021-a94d-329689a6ae77-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"57c8b80e-76a2-4021-a94d-329689a6ae77\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 19:56:02.790741 master-0 kubenswrapper[9368]: I1203 19:56:02.790540 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/57c8b80e-76a2-4021-a94d-329689a6ae77-var-lock\") pod \"installer-2-master-0\" (UID: \"57c8b80e-76a2-4021-a94d-329689a6ae77\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 19:56:02.790741 master-0 kubenswrapper[9368]: I1203 19:56:02.790582 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57c8b80e-76a2-4021-a94d-329689a6ae77-kube-api-access\") pod \"installer-2-master-0\" (UID: \"57c8b80e-76a2-4021-a94d-329689a6ae77\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 19:56:02.791036 master-0 kubenswrapper[9368]: I1203 19:56:02.790978 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57c8b80e-76a2-4021-a94d-329689a6ae77-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"57c8b80e-76a2-4021-a94d-329689a6ae77\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 19:56:02.791036 master-0 kubenswrapper[9368]: I1203 19:56:02.791024 9368 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/57c8b80e-76a2-4021-a94d-329689a6ae77-var-lock\") pod \"installer-2-master-0\" (UID: \"57c8b80e-76a2-4021-a94d-329689a6ae77\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 19:56:02.819095 master-0 kubenswrapper[9368]: I1203 19:56:02.818975 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57c8b80e-76a2-4021-a94d-329689a6ae77-kube-api-access\") pod \"installer-2-master-0\" (UID: \"57c8b80e-76a2-4021-a94d-329689a6ae77\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 19:56:02.989722 master-0 kubenswrapper[9368]: I1203 19:56:02.989566 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 19:56:03.905033 master-0 kubenswrapper[9368]: I1203 19:56:03.904953 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-869c786959-zbl42"] Dec 03 19:56:03.905476 master-0 kubenswrapper[9368]: I1203 19:56:03.905155 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" podUID="0c45d22f-1492-47d7-83b6-6dd356a8454d" containerName="cluster-version-operator" containerID="cri-o://87bcebaceb595ec403e81aaa5fffa9154881610a79d28fb3cc8af6166aa4a671" gracePeriod=130 Dec 03 19:56:04.453813 master-0 kubenswrapper[9368]: I1203 19:56:04.453438 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:56:04.515179 master-0 kubenswrapper[9368]: I1203 19:56:04.515120 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0c45d22f-1492-47d7-83b6-6dd356a8454d-etc-ssl-certs\") pod \"0c45d22f-1492-47d7-83b6-6dd356a8454d\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " Dec 03 19:56:04.515287 master-0 kubenswrapper[9368]: I1203 19:56:04.515191 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c45d22f-1492-47d7-83b6-6dd356a8454d-kube-api-access\") pod \"0c45d22f-1492-47d7-83b6-6dd356a8454d\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " Dec 03 19:56:04.515287 master-0 kubenswrapper[9368]: I1203 19:56:04.515213 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0c45d22f-1492-47d7-83b6-6dd356a8454d-etc-cvo-updatepayloads\") pod \"0c45d22f-1492-47d7-83b6-6dd356a8454d\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " Dec 03 19:56:04.515287 master-0 kubenswrapper[9368]: I1203 19:56:04.515234 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert\") pod \"0c45d22f-1492-47d7-83b6-6dd356a8454d\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " Dec 03 19:56:04.515287 master-0 kubenswrapper[9368]: I1203 19:56:04.515267 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c45d22f-1492-47d7-83b6-6dd356a8454d-service-ca\") pod \"0c45d22f-1492-47d7-83b6-6dd356a8454d\" (UID: \"0c45d22f-1492-47d7-83b6-6dd356a8454d\") " Dec 03 19:56:04.515287 master-0 kubenswrapper[9368]: I1203 
19:56:04.515263 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c45d22f-1492-47d7-83b6-6dd356a8454d-etc-ssl-certs" (OuterVolumeSpecName: "etc-ssl-certs") pod "0c45d22f-1492-47d7-83b6-6dd356a8454d" (UID: "0c45d22f-1492-47d7-83b6-6dd356a8454d"). InnerVolumeSpecName "etc-ssl-certs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:56:04.515492 master-0 kubenswrapper[9368]: I1203 19:56:04.515327 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c45d22f-1492-47d7-83b6-6dd356a8454d-etc-cvo-updatepayloads" (OuterVolumeSpecName: "etc-cvo-updatepayloads") pod "0c45d22f-1492-47d7-83b6-6dd356a8454d" (UID: "0c45d22f-1492-47d7-83b6-6dd356a8454d"). InnerVolumeSpecName "etc-cvo-updatepayloads". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:56:04.515492 master-0 kubenswrapper[9368]: I1203 19:56:04.515459 9368 reconciler_common.go:293] "Volume detached for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0c45d22f-1492-47d7-83b6-6dd356a8454d-etc-ssl-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 19:56:04.515492 master-0 kubenswrapper[9368]: I1203 19:56:04.515474 9368 reconciler_common.go:293] "Volume detached for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0c45d22f-1492-47d7-83b6-6dd356a8454d-etc-cvo-updatepayloads\") on node \"master-0\" DevicePath \"\"" Dec 03 19:56:04.515872 master-0 kubenswrapper[9368]: I1203 19:56:04.515834 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c45d22f-1492-47d7-83b6-6dd356a8454d-service-ca" (OuterVolumeSpecName: "service-ca") pod "0c45d22f-1492-47d7-83b6-6dd356a8454d" (UID: "0c45d22f-1492-47d7-83b6-6dd356a8454d"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:56:04.521437 master-0 kubenswrapper[9368]: I1203 19:56:04.521395 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0c45d22f-1492-47d7-83b6-6dd356a8454d" (UID: "0c45d22f-1492-47d7-83b6-6dd356a8454d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:56:04.522338 master-0 kubenswrapper[9368]: I1203 19:56:04.522295 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c45d22f-1492-47d7-83b6-6dd356a8454d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0c45d22f-1492-47d7-83b6-6dd356a8454d" (UID: "0c45d22f-1492-47d7-83b6-6dd356a8454d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:56:04.617316 master-0 kubenswrapper[9368]: I1203 19:56:04.617280 9368 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c45d22f-1492-47d7-83b6-6dd356a8454d-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 19:56:04.617527 master-0 kubenswrapper[9368]: I1203 19:56:04.617514 9368 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c45d22f-1492-47d7-83b6-6dd356a8454d-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 19:56:04.617595 master-0 kubenswrapper[9368]: I1203 19:56:04.617586 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c45d22f-1492-47d7-83b6-6dd356a8454d-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 19:56:04.835875 master-0 kubenswrapper[9368]: I1203 19:56:04.835844 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j"] Dec 
03 19:56:04.877071 master-0 kubenswrapper[9368]: I1203 19:56:04.876355 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf"] Dec 03 19:56:04.882458 master-0 kubenswrapper[9368]: I1203 19:56:04.882044 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" event={"ID":"0d4e4f88-7106-4a46-8b63-053345922fb0","Type":"ContainerStarted","Data":"2f3d798fc128d08f2b78c16a96552eb1af844c024c5ff08c6a9c3b2ad0da6b71"} Dec 03 19:56:04.882637 master-0 kubenswrapper[9368]: I1203 19:56:04.882619 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 19:56:04.885974 master-0 kubenswrapper[9368]: I1203 19:56:04.885931 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" event={"ID":"78a864f2-934f-4197-9753-24c9bc7f1fca","Type":"ContainerStarted","Data":"6f8d03455884710e737b779ab993de7b077a6712d61dd531eb926a20dcac48c1"} Dec 03 19:56:04.888978 master-0 kubenswrapper[9368]: I1203 19:56:04.888835 9368 generic.go:334] "Generic (PLEG): container finished" podID="c593a75e-c2af-4419-94da-e0c9ff14c41f" containerID="6652c1726daa2f760a59f8139ccfc6f5f17852cbb0841f5678084529cf67893c" exitCode=0 Dec 03 19:56:04.888978 master-0 kubenswrapper[9368]: I1203 19:56:04.888902 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-b46c54696-bgb45" event={"ID":"c593a75e-c2af-4419-94da-e0c9ff14c41f","Type":"ContainerDied","Data":"6652c1726daa2f760a59f8139ccfc6f5f17852cbb0841f5678084529cf67893c"} Dec 03 19:56:04.893652 master-0 kubenswrapper[9368]: I1203 19:56:04.891706 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" 
event={"ID":"a19b8f9e-6299-43bf-9aa5-22071b855773","Type":"ContainerStarted","Data":"64dd796136fb20fd559cb8e8ea027107f34af29c0bb6d6c8f0a3ff11ebf8aabc"} Dec 03 19:56:04.893652 master-0 kubenswrapper[9368]: I1203 19:56:04.892223 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" Dec 03 19:56:04.900960 master-0 kubenswrapper[9368]: I1203 19:56:04.898697 9368 generic.go:334] "Generic (PLEG): container finished" podID="0c45d22f-1492-47d7-83b6-6dd356a8454d" containerID="87bcebaceb595ec403e81aaa5fffa9154881610a79d28fb3cc8af6166aa4a671" exitCode=0 Dec 03 19:56:04.900960 master-0 kubenswrapper[9368]: I1203 19:56:04.898759 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" event={"ID":"0c45d22f-1492-47d7-83b6-6dd356a8454d","Type":"ContainerDied","Data":"87bcebaceb595ec403e81aaa5fffa9154881610a79d28fb3cc8af6166aa4a671"} Dec 03 19:56:04.900960 master-0 kubenswrapper[9368]: I1203 19:56:04.898796 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" event={"ID":"0c45d22f-1492-47d7-83b6-6dd356a8454d","Type":"ContainerDied","Data":"04c8eaa274e2cca2857aa142579311ee009454f560d3839f6a387b3b67a5bfe1"} Dec 03 19:56:04.900960 master-0 kubenswrapper[9368]: I1203 19:56:04.898814 9368 scope.go:117] "RemoveContainer" containerID="87bcebaceb595ec403e81aaa5fffa9154881610a79d28fb3cc8af6166aa4a671" Dec 03 19:56:04.900960 master-0 kubenswrapper[9368]: I1203 19:56:04.898953 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-869c786959-zbl42" Dec 03 19:56:04.900960 master-0 kubenswrapper[9368]: W1203 19:56:04.899832 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73b7027e_44f5_4c7b_9226_585a90530535.slice/crio-e59b58099dad5ea5ed0fd3c1716f8fb9f04f32f368cb6e0afc9cede661e06a70 WatchSource:0}: Error finding container e59b58099dad5ea5ed0fd3c1716f8fb9f04f32f368cb6e0afc9cede661e06a70: Status 404 returned error can't find the container with id e59b58099dad5ea5ed0fd3c1716f8fb9f04f32f368cb6e0afc9cede661e06a70 Dec 03 19:56:04.901667 master-0 kubenswrapper[9368]: I1203 19:56:04.901564 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" Dec 03 19:56:04.903593 master-0 kubenswrapper[9368]: I1203 19:56:04.903527 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dbfhg" event={"ID":"d196dca7-f940-4aa0-b20a-214d22b62db6","Type":"ContainerStarted","Data":"aa0eef388a8df38f070dcb7bce1e13710c1224b9c3b1496b746c7c2ab84af6a1"} Dec 03 19:56:04.911730 master-0 kubenswrapper[9368]: I1203 19:56:04.911686 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" event={"ID":"d5f33153-bff1-403f-ae17-b7e90500365d","Type":"ContainerStarted","Data":"65424186b7f37ca5dabafb1787258784712f7c583675d920ab0fbd16681af536"} Dec 03 19:56:04.912302 master-0 kubenswrapper[9368]: I1203 19:56:04.912280 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 19:56:04.917425 master-0 kubenswrapper[9368]: I1203 19:56:04.917388 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" 
Dec 03 19:56:04.925232 master-0 kubenswrapper[9368]: I1203 19:56:04.925196 9368 scope.go:117] "RemoveContainer" containerID="87bcebaceb595ec403e81aaa5fffa9154881610a79d28fb3cc8af6166aa4a671" Dec 03 19:56:04.930510 master-0 kubenswrapper[9368]: E1203 19:56:04.930462 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87bcebaceb595ec403e81aaa5fffa9154881610a79d28fb3cc8af6166aa4a671\": container with ID starting with 87bcebaceb595ec403e81aaa5fffa9154881610a79d28fb3cc8af6166aa4a671 not found: ID does not exist" containerID="87bcebaceb595ec403e81aaa5fffa9154881610a79d28fb3cc8af6166aa4a671" Dec 03 19:56:04.930676 master-0 kubenswrapper[9368]: I1203 19:56:04.930507 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87bcebaceb595ec403e81aaa5fffa9154881610a79d28fb3cc8af6166aa4a671"} err="failed to get container status \"87bcebaceb595ec403e81aaa5fffa9154881610a79d28fb3cc8af6166aa4a671\": rpc error: code = NotFound desc = could not find container \"87bcebaceb595ec403e81aaa5fffa9154881610a79d28fb3cc8af6166aa4a671\": container with ID starting with 87bcebaceb595ec403e81aaa5fffa9154881610a79d28fb3cc8af6166aa4a671 not found: ID does not exist" Dec 03 19:56:04.974422 master-0 kubenswrapper[9368]: I1203 19:56:04.974382 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb"] Dec 03 19:56:04.987648 master-0 kubenswrapper[9368]: I1203 19:56:04.987602 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 03 19:56:05.019240 master-0 kubenswrapper[9368]: W1203 19:56:05.013752 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod57c8b80e_76a2_4021_a94d_329689a6ae77.slice/crio-78e39900d57bd0cc50509a32a805aacc896f9b41d8084d5242f5aee8df82f861 WatchSource:0}: Error finding container 
78e39900d57bd0cc50509a32a805aacc896f9b41d8084d5242f5aee8df82f861: Status 404 returned error can't find the container with id 78e39900d57bd0cc50509a32a805aacc896f9b41d8084d5242f5aee8df82f861 Dec 03 19:56:05.019240 master-0 kubenswrapper[9368]: W1203 19:56:05.014342 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf96c70ce_314a_4919_91e9_cc776a620846.slice/crio-d6be581fee143ab495ef9288ca56547cef2f234318e097d914e85b3da00c3425 WatchSource:0}: Error finding container d6be581fee143ab495ef9288ca56547cef2f234318e097d914e85b3da00c3425: Status 404 returned error can't find the container with id d6be581fee143ab495ef9288ca56547cef2f234318e097d914e85b3da00c3425 Dec 03 19:56:05.048697 master-0 kubenswrapper[9368]: I1203 19:56:05.046453 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-869c786959-zbl42"] Dec 03 19:56:05.052827 master-0 kubenswrapper[9368]: I1203 19:56:05.050437 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-version/cluster-version-operator-869c786959-zbl42"] Dec 03 19:56:05.117366 master-0 kubenswrapper[9368]: I1203 19:56:05.090151 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd"] Dec 03 19:56:05.117366 master-0 kubenswrapper[9368]: E1203 19:56:05.090306 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c45d22f-1492-47d7-83b6-6dd356a8454d" containerName="cluster-version-operator" Dec 03 19:56:05.117366 master-0 kubenswrapper[9368]: I1203 19:56:05.090317 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c45d22f-1492-47d7-83b6-6dd356a8454d" containerName="cluster-version-operator" Dec 03 19:56:05.117366 master-0 kubenswrapper[9368]: I1203 19:56:05.090389 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c45d22f-1492-47d7-83b6-6dd356a8454d" 
containerName="cluster-version-operator" Dec 03 19:56:05.117366 master-0 kubenswrapper[9368]: I1203 19:56:05.090696 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 19:56:05.117366 master-0 kubenswrapper[9368]: I1203 19:56:05.093279 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 19:56:05.117366 master-0 kubenswrapper[9368]: I1203 19:56:05.093381 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 19:56:05.117366 master-0 kubenswrapper[9368]: I1203 19:56:05.094406 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 19:56:05.124291 master-0 kubenswrapper[9368]: I1203 19:56:05.124248 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8709c6c-8729-4702-a3fb-35a072855096-service-ca\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 19:56:05.124291 master-0 kubenswrapper[9368]: I1203 19:56:05.124290 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8709c6c-8729-4702-a3fb-35a072855096-serving-cert\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 19:56:05.124397 master-0 kubenswrapper[9368]: I1203 19:56:05.124335 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/b8709c6c-8729-4702-a3fb-35a072855096-etc-ssl-certs\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 19:56:05.124397 master-0 kubenswrapper[9368]: I1203 19:56:05.124356 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b8709c6c-8729-4702-a3fb-35a072855096-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 19:56:05.124397 master-0 kubenswrapper[9368]: I1203 19:56:05.124383 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8709c6c-8729-4702-a3fb-35a072855096-kube-api-access\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 19:56:05.165146 master-0 kubenswrapper[9368]: I1203 19:56:05.165017 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:56:05.165250 master-0 kubenswrapper[9368]: I1203 19:56:05.165171 9368 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 19:56:05.191074 master-0 kubenswrapper[9368]: I1203 19:56:05.191040 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 19:56:05.225119 master-0 kubenswrapper[9368]: I1203 19:56:05.225078 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/b8709c6c-8729-4702-a3fb-35a072855096-etc-ssl-certs\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 19:56:05.225249 master-0 kubenswrapper[9368]: I1203 19:56:05.225123 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b8709c6c-8729-4702-a3fb-35a072855096-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 19:56:05.225249 master-0 kubenswrapper[9368]: I1203 19:56:05.225154 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8709c6c-8729-4702-a3fb-35a072855096-kube-api-access\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 19:56:05.225249 master-0 kubenswrapper[9368]: I1203 19:56:05.225210 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b8709c6c-8729-4702-a3fb-35a072855096-etc-ssl-certs\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 19:56:05.225759 master-0 kubenswrapper[9368]: I1203 19:56:05.225259 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8709c6c-8729-4702-a3fb-35a072855096-service-ca\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " 
pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd"
Dec 03 19:56:05.225759 master-0 kubenswrapper[9368]: I1203 19:56:05.225282 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8709c6c-8729-4702-a3fb-35a072855096-serving-cert\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd"
Dec 03 19:56:05.225912 master-0 kubenswrapper[9368]: I1203 19:56:05.225796 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b8709c6c-8729-4702-a3fb-35a072855096-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd"
Dec 03 19:56:05.226384 master-0 kubenswrapper[9368]: I1203 19:56:05.226339 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8709c6c-8729-4702-a3fb-35a072855096-service-ca\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd"
Dec 03 19:56:05.229736 master-0 kubenswrapper[9368]: I1203 19:56:05.229680 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8709c6c-8729-4702-a3fb-35a072855096-serving-cert\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd"
Dec 03 19:56:05.243721 master-0 kubenswrapper[9368]: I1203 19:56:05.243616 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8709c6c-8729-4702-a3fb-35a072855096-kube-api-access\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd"
Dec 03 19:56:05.432066 master-0 kubenswrapper[9368]: I1203 19:56:05.432003 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd"
Dec 03 19:56:05.490946 master-0 kubenswrapper[9368]: W1203 19:56:05.490895 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8709c6c_8729_4702_a3fb_35a072855096.slice/crio-42bc79c7b9ffa15ca475a4edc477b358626509600367cbde78e61fb4d3277efb WatchSource:0}: Error finding container 42bc79c7b9ffa15ca475a4edc477b358626509600367cbde78e61fb4d3277efb: Status 404 returned error can't find the container with id 42bc79c7b9ffa15ca475a4edc477b358626509600367cbde78e61fb4d3277efb
Dec 03 19:56:05.525463 master-0 kubenswrapper[9368]: I1203 19:56:05.525162 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-x6vwd"
Dec 03 19:56:05.910430 master-0 kubenswrapper[9368]: I1203 19:56:05.910237 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sp868"]
Dec 03 19:56:05.911095 master-0 kubenswrapper[9368]: I1203 19:56:05.911036 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sp868"
Dec 03 19:56:05.920949 master-0 kubenswrapper[9368]: I1203 19:56:05.920881 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"57c8b80e-76a2-4021-a94d-329689a6ae77","Type":"ContainerStarted","Data":"ee4f175a218e9a194bdf882bfe97c9b5996e8b86be2b8fc0a0d51a2b228723a3"}
Dec 03 19:56:05.920949 master-0 kubenswrapper[9368]: I1203 19:56:05.920941 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"57c8b80e-76a2-4021-a94d-329689a6ae77","Type":"ContainerStarted","Data":"78e39900d57bd0cc50509a32a805aacc896f9b41d8084d5242f5aee8df82f861"}
Dec 03 19:56:05.955404 master-0 kubenswrapper[9368]: I1203 19:56:05.921008 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sp868"]
Dec 03 19:56:05.955404 master-0 kubenswrapper[9368]: I1203 19:56:05.929307 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" event={"ID":"f96c70ce-314a-4919-91e9-cc776a620846","Type":"ContainerStarted","Data":"d6be581fee143ab495ef9288ca56547cef2f234318e097d914e85b3da00c3425"}
Dec 03 19:56:05.967847 master-0 kubenswrapper[9368]: I1203 19:56:05.966042 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-b46c54696-bgb45" event={"ID":"c593a75e-c2af-4419-94da-e0c9ff14c41f","Type":"ContainerStarted","Data":"3c4779d21d8f2bc4c966737b0cf7224ac21f30ab6d08f8762942e259f8b3d374"}
Dec 03 19:56:05.967847 master-0 kubenswrapper[9368]: I1203 19:56:05.966080 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-b46c54696-bgb45" event={"ID":"c593a75e-c2af-4419-94da-e0c9ff14c41f","Type":"ContainerStarted","Data":"5388804769e1840e70f85b087c8c2e7b0db3b73cb7342b147d2cb5a4c16f2afa"}
Dec 03 19:56:05.969414 master-0 kubenswrapper[9368]: I1203 19:56:05.969365 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" event={"ID":"1f82c7a1-ec21-497d-86f2-562cafa7ace7","Type":"ContainerStarted","Data":"026026ef6ee70bf24fbc2d66c86cdbf2ce61498e9a51c23017b8994c7f1700dd"}
Dec 03 19:56:05.969476 master-0 kubenswrapper[9368]: I1203 19:56:05.969417 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" event={"ID":"1f82c7a1-ec21-497d-86f2-562cafa7ace7","Type":"ContainerStarted","Data":"2fffba85103e52018a172418fe2ab2867595c438bee56e8e6e0ef68aad03ce36"}
Dec 03 19:56:05.969476 master-0 kubenswrapper[9368]: I1203 19:56:05.969428 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" event={"ID":"1f82c7a1-ec21-497d-86f2-562cafa7ace7","Type":"ContainerStarted","Data":"dbc9d9f3c90ebc5bbbfe36c2028e07277634315bcc3781675056eb652072f16a"}
Dec 03 19:56:05.970038 master-0 kubenswrapper[9368]: I1203 19:56:05.970010 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j"
Dec 03 19:56:05.971454 master-0 kubenswrapper[9368]: I1203 19:56:05.971378 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" event={"ID":"73b7027e-44f5-4c7b-9226-585a90530535","Type":"ContainerStarted","Data":"6ddd5106c28e189ef6e1099b648c9c04324b531aad0d0aa2c20c8039434b05e8"}
Dec 03 19:56:05.971454 master-0 kubenswrapper[9368]: I1203 19:56:05.971408 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" event={"ID":"73b7027e-44f5-4c7b-9226-585a90530535","Type":"ContainerStarted","Data":"3595f145ca5f9a4066302e9ae5d79e04995d58d28db2a03322a4e2a341e9fec2"}
Dec 03 19:56:05.971454 master-0 kubenswrapper[9368]: I1203 19:56:05.971418 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" event={"ID":"73b7027e-44f5-4c7b-9226-585a90530535","Type":"ContainerStarted","Data":"e59b58099dad5ea5ed0fd3c1716f8fb9f04f32f368cb6e0afc9cede661e06a70"}
Dec 03 19:56:05.971912 master-0 kubenswrapper[9368]: I1203 19:56:05.971829 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf"
Dec 03 19:56:05.973161 master-0 kubenswrapper[9368]: I1203 19:56:05.973128 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dbfhg" event={"ID":"d196dca7-f940-4aa0-b20a-214d22b62db6","Type":"ContainerStarted","Data":"39a89f32c88302244d5300fb97a2acc13f063cf38ae84b25b9b65189f2e36b93"}
Dec 03 19:56:05.973292 master-0 kubenswrapper[9368]: I1203 19:56:05.973257 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-dbfhg"
Dec 03 19:56:05.978358 master-0 kubenswrapper[9368]: I1203 19:56:05.977890 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-2-master-0" podStartSLOduration=3.977873094 podStartE2EDuration="3.977873094s" podCreationTimestamp="2025-12-03 19:56:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:56:05.977172636 +0000 UTC m=+31.638422547" watchObservedRunningTime="2025-12-03 19:56:05.977873094 +0000 UTC m=+31.639123005"
Dec 03 19:56:05.978358 master-0 kubenswrapper[9368]: I1203 19:56:05.978173 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" event={"ID":"b8709c6c-8729-4702-a3fb-35a072855096","Type":"ContainerStarted","Data":"f74560024271b473d288e14ac60c9ecd05f2a6752be21eac89b4a74e35f9a5d8"}
Dec 03 19:56:05.978358 master-0 kubenswrapper[9368]: I1203 19:56:05.978208 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" event={"ID":"b8709c6c-8729-4702-a3fb-35a072855096","Type":"ContainerStarted","Data":"42bc79c7b9ffa15ca475a4edc477b358626509600367cbde78e61fb4d3277efb"}
Dec 03 19:56:05.994122 master-0 kubenswrapper[9368]: I1203 19:56:05.994052 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" podStartSLOduration=8.994033694 podStartE2EDuration="8.994033694s" podCreationTimestamp="2025-12-03 19:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:56:05.993183253 +0000 UTC m=+31.654433174" watchObservedRunningTime="2025-12-03 19:56:05.994033694 +0000 UTC m=+31.655283605"
Dec 03 19:56:06.018133 master-0 kubenswrapper[9368]: I1203 19:56:06.017754 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" podStartSLOduration=9.017732636 podStartE2EDuration="9.017732636s" podCreationTimestamp="2025-12-03 19:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:56:06.016358681 +0000 UTC m=+31.677608602" watchObservedRunningTime="2025-12-03 19:56:06.017732636 +0000 UTC m=+31.678982557"
Dec 03 19:56:06.046802 master-0 kubenswrapper[9368]: I1203 19:56:06.042575 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dbfhg" podStartSLOduration=7.163717224 podStartE2EDuration="15.042560347s" podCreationTimestamp="2025-12-03 19:55:51 +0000 UTC" firstStartedPulling="2025-12-03 19:55:56.540867214 +0000 UTC m=+22.202117125" lastFinishedPulling="2025-12-03 19:56:04.419710297 +0000 UTC m=+30.080960248" observedRunningTime="2025-12-03 19:56:06.041326636 +0000 UTC m=+31.702576547" watchObservedRunningTime="2025-12-03 19:56:06.042560347 +0000 UTC m=+31.703810258"
Dec 03 19:56:06.054369 master-0 kubenswrapper[9368]: I1203 19:56:06.054308 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48dfa48e-caea-4017-bd3e-d1da8bcd2da7-catalog-content\") pod \"certified-operators-sp868\" (UID: \"48dfa48e-caea-4017-bd3e-d1da8bcd2da7\") " pod="openshift-marketplace/certified-operators-sp868"
Dec 03 19:56:06.055593 master-0 kubenswrapper[9368]: I1203 19:56:06.054654 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48dfa48e-caea-4017-bd3e-d1da8bcd2da7-utilities\") pod \"certified-operators-sp868\" (UID: \"48dfa48e-caea-4017-bd3e-d1da8bcd2da7\") " pod="openshift-marketplace/certified-operators-sp868"
Dec 03 19:56:06.055593 master-0 kubenswrapper[9368]: I1203 19:56:06.054838 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssvzh\" (UniqueName: \"kubernetes.io/projected/48dfa48e-caea-4017-bd3e-d1da8bcd2da7-kube-api-access-ssvzh\") pod \"certified-operators-sp868\" (UID: \"48dfa48e-caea-4017-bd3e-d1da8bcd2da7\") " pod="openshift-marketplace/certified-operators-sp868"
Dec 03 19:56:06.072459 master-0 kubenswrapper[9368]: I1203 19:56:06.072343 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-b46c54696-bgb45" podStartSLOduration=9.236817791 podStartE2EDuration="17.072324063s" podCreationTimestamp="2025-12-03 19:55:49 +0000 UTC" firstStartedPulling="2025-12-03 19:55:56.607920897 +0000 UTC m=+22.269170808" lastFinishedPulling="2025-12-03 19:56:04.443427169 +0000 UTC m=+30.104677080" observedRunningTime="2025-12-03 19:56:06.070858935 +0000 UTC m=+31.732108866" watchObservedRunningTime="2025-12-03 19:56:06.072324063 +0000 UTC m=+31.733573994"
Dec 03 19:56:06.101799 master-0 kubenswrapper[9368]: I1203 19:56:06.101652 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" podStartSLOduration=1.101631997 podStartE2EDuration="1.101631997s" podCreationTimestamp="2025-12-03 19:56:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:56:06.099962325 +0000 UTC m=+31.761212246" watchObservedRunningTime="2025-12-03 19:56:06.101631997 +0000 UTC m=+31.762881908"
Dec 03 19:56:06.117899 master-0 kubenswrapper[9368]: I1203 19:56:06.117847 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r2c8x"]
Dec 03 19:56:06.121237 master-0 kubenswrapper[9368]: I1203 19:56:06.118899 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r2c8x"
Dec 03 19:56:06.136556 master-0 kubenswrapper[9368]: I1203 19:56:06.136490 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r2c8x"]
Dec 03 19:56:06.156540 master-0 kubenswrapper[9368]: I1203 19:56:06.156396 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48dfa48e-caea-4017-bd3e-d1da8bcd2da7-utilities\") pod \"certified-operators-sp868\" (UID: \"48dfa48e-caea-4017-bd3e-d1da8bcd2da7\") " pod="openshift-marketplace/certified-operators-sp868"
Dec 03 19:56:06.157240 master-0 kubenswrapper[9368]: I1203 19:56:06.156696 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssvzh\" (UniqueName: \"kubernetes.io/projected/48dfa48e-caea-4017-bd3e-d1da8bcd2da7-kube-api-access-ssvzh\") pod \"certified-operators-sp868\" (UID: \"48dfa48e-caea-4017-bd3e-d1da8bcd2da7\") " pod="openshift-marketplace/certified-operators-sp868"
Dec 03 19:56:06.157240 master-0 kubenswrapper[9368]: I1203 19:56:06.156754 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48dfa48e-caea-4017-bd3e-d1da8bcd2da7-utilities\") pod \"certified-operators-sp868\" (UID: \"48dfa48e-caea-4017-bd3e-d1da8bcd2da7\") " pod="openshift-marketplace/certified-operators-sp868"
Dec 03 19:56:06.157240 master-0 kubenswrapper[9368]: I1203 19:56:06.156821 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48dfa48e-caea-4017-bd3e-d1da8bcd2da7-catalog-content\") pod \"certified-operators-sp868\" (UID: \"48dfa48e-caea-4017-bd3e-d1da8bcd2da7\") " pod="openshift-marketplace/certified-operators-sp868"
Dec 03 19:56:06.157240 master-0 kubenswrapper[9368]: I1203 19:56:06.157018 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48dfa48e-caea-4017-bd3e-d1da8bcd2da7-catalog-content\") pod \"certified-operators-sp868\" (UID: \"48dfa48e-caea-4017-bd3e-d1da8bcd2da7\") " pod="openshift-marketplace/certified-operators-sp868"
Dec 03 19:56:06.175239 master-0 kubenswrapper[9368]: I1203 19:56:06.175135 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssvzh\" (UniqueName: \"kubernetes.io/projected/48dfa48e-caea-4017-bd3e-d1da8bcd2da7-kube-api-access-ssvzh\") pod \"certified-operators-sp868\" (UID: \"48dfa48e-caea-4017-bd3e-d1da8bcd2da7\") " pod="openshift-marketplace/certified-operators-sp868"
Dec 03 19:56:06.261798 master-0 kubenswrapper[9368]: I1203 19:56:06.258480 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb1d894-1bc0-478d-87fc-e9137291df70-utilities\") pod \"community-operators-r2c8x\" (UID: \"acb1d894-1bc0-478d-87fc-e9137291df70\") " pod="openshift-marketplace/community-operators-r2c8x"
Dec 03 19:56:06.261798 master-0 kubenswrapper[9368]: I1203 19:56:06.258598 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl7hr\" (UniqueName: \"kubernetes.io/projected/acb1d894-1bc0-478d-87fc-e9137291df70-kube-api-access-pl7hr\") pod \"community-operators-r2c8x\" (UID: \"acb1d894-1bc0-478d-87fc-e9137291df70\") " pod="openshift-marketplace/community-operators-r2c8x"
Dec 03 19:56:06.261798 master-0 kubenswrapper[9368]: I1203 19:56:06.258651 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb1d894-1bc0-478d-87fc-e9137291df70-catalog-content\") pod \"community-operators-r2c8x\" (UID: \"acb1d894-1bc0-478d-87fc-e9137291df70\") " pod="openshift-marketplace/community-operators-r2c8x"
Dec 03 19:56:06.269795 master-0 kubenswrapper[9368]: I1203 19:56:06.266217 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sp868"
Dec 03 19:56:06.361057 master-0 kubenswrapper[9368]: I1203 19:56:06.360357 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl7hr\" (UniqueName: \"kubernetes.io/projected/acb1d894-1bc0-478d-87fc-e9137291df70-kube-api-access-pl7hr\") pod \"community-operators-r2c8x\" (UID: \"acb1d894-1bc0-478d-87fc-e9137291df70\") " pod="openshift-marketplace/community-operators-r2c8x"
Dec 03 19:56:06.361057 master-0 kubenswrapper[9368]: I1203 19:56:06.360433 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb1d894-1bc0-478d-87fc-e9137291df70-catalog-content\") pod \"community-operators-r2c8x\" (UID: \"acb1d894-1bc0-478d-87fc-e9137291df70\") " pod="openshift-marketplace/community-operators-r2c8x"
Dec 03 19:56:06.361057 master-0 kubenswrapper[9368]: I1203 19:56:06.360505 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb1d894-1bc0-478d-87fc-e9137291df70-utilities\") pod \"community-operators-r2c8x\" (UID: \"acb1d894-1bc0-478d-87fc-e9137291df70\") " pod="openshift-marketplace/community-operators-r2c8x"
Dec 03 19:56:06.361057 master-0 kubenswrapper[9368]: I1203 19:56:06.361013 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb1d894-1bc0-478d-87fc-e9137291df70-utilities\") pod \"community-operators-r2c8x\" (UID: \"acb1d894-1bc0-478d-87fc-e9137291df70\") " pod="openshift-marketplace/community-operators-r2c8x"
Dec 03 19:56:06.361057 master-0 kubenswrapper[9368]: I1203 19:56:06.361060 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb1d894-1bc0-478d-87fc-e9137291df70-catalog-content\") pod \"community-operators-r2c8x\" (UID: \"acb1d894-1bc0-478d-87fc-e9137291df70\") " pod="openshift-marketplace/community-operators-r2c8x"
Dec 03 19:56:06.375464 master-0 kubenswrapper[9368]: I1203 19:56:06.375427 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl7hr\" (UniqueName: \"kubernetes.io/projected/acb1d894-1bc0-478d-87fc-e9137291df70-kube-api-access-pl7hr\") pod \"community-operators-r2c8x\" (UID: \"acb1d894-1bc0-478d-87fc-e9137291df70\") " pod="openshift-marketplace/community-operators-r2c8x"
Dec 03 19:56:06.445311 master-0 kubenswrapper[9368]: I1203 19:56:06.445211 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r2c8x"
Dec 03 19:56:06.565472 master-0 kubenswrapper[9368]: I1203 19:56:06.564414 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c45d22f-1492-47d7-83b6-6dd356a8454d" path="/var/lib/kubelet/pods/0c45d22f-1492-47d7-83b6-6dd356a8454d/volumes"
Dec 03 19:56:06.859375 master-0 kubenswrapper[9368]: I1203 19:56:06.859309 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sp868"]
Dec 03 19:56:06.900244 master-0 kubenswrapper[9368]: I1203 19:56:06.900212 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r2c8x"]
Dec 03 19:56:06.910149 master-0 kubenswrapper[9368]: W1203 19:56:06.910113 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacb1d894_1bc0_478d_87fc_e9137291df70.slice/crio-71864668e71ee0adcfe271632cee980c0921d9d37de64e40c034340e1013deba WatchSource:0}: Error finding container 71864668e71ee0adcfe271632cee980c0921d9d37de64e40c034340e1013deba: Status 404 returned error can't find the container with id 71864668e71ee0adcfe271632cee980c0921d9d37de64e40c034340e1013deba
Dec 03 19:56:06.990524 master-0 kubenswrapper[9368]: I1203 19:56:06.990482 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sp868" event={"ID":"48dfa48e-caea-4017-bd3e-d1da8bcd2da7","Type":"ContainerStarted","Data":"11032c235e99b15636a3920f484a0a8fd50c568319f66ba43948c41f56636e33"}
Dec 03 19:56:06.990524 master-0 kubenswrapper[9368]: I1203 19:56:06.990522 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sp868" event={"ID":"48dfa48e-caea-4017-bd3e-d1da8bcd2da7","Type":"ContainerStarted","Data":"97db26d863bf0ebdc932c5639db85fc3842260317e397675b72f82e6a0ecb736"}
Dec 03 19:56:06.993289 master-0 kubenswrapper[9368]: I1203 19:56:06.993252 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2c8x" event={"ID":"acb1d894-1bc0-478d-87fc-e9137291df70","Type":"ContainerStarted","Data":"71864668e71ee0adcfe271632cee980c0921d9d37de64e40c034340e1013deba"}
Dec 03 19:56:07.491467 master-0 kubenswrapper[9368]: I1203 19:56:07.491423 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj"
Dec 03 19:56:07.491823 master-0 kubenswrapper[9368]: I1203 19:56:07.491750 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj"
Dec 03 19:56:07.492079 master-0 kubenswrapper[9368]: E1203 19:56:07.492059 9368 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Dec 03 19:56:07.492212 master-0 kubenswrapper[9368]: E1203 19:56:07.492200 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca podName:82718569-4870-4f94-b2e7-7ccd7d4de8ff nodeName:}" failed. No retries permitted until 2025-12-03 19:56:39.492182267 +0000 UTC m=+65.153432178 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca") pod "route-controller-manager-54bbbcd887-h4khj" (UID: "82718569-4870-4f94-b2e7-7ccd7d4de8ff") : configmap "client-ca" not found
Dec 03 19:56:07.495361 master-0 kubenswrapper[9368]: I1203 19:56:07.495339 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert\") pod \"route-controller-manager-54bbbcd887-h4khj\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj"
Dec 03 19:56:07.510614 master-0 kubenswrapper[9368]: I1203 19:56:07.510479 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mc8kx"]
Dec 03 19:56:07.511568 master-0 kubenswrapper[9368]: I1203 19:56:07.511548 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mc8kx"
Dec 03 19:56:07.519612 master-0 kubenswrapper[9368]: I1203 19:56:07.519583 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mc8kx"]
Dec 03 19:56:07.593481 master-0 kubenswrapper[9368]: I1203 19:56:07.593441 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81839b26-cf66-4532-a646-ef4cd5d5e471-catalog-content\") pod \"redhat-marketplace-mc8kx\" (UID: \"81839b26-cf66-4532-a646-ef4cd5d5e471\") " pod="openshift-marketplace/redhat-marketplace-mc8kx"
Dec 03 19:56:07.593481 master-0 kubenswrapper[9368]: I1203 19:56:07.593488 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81839b26-cf66-4532-a646-ef4cd5d5e471-utilities\") pod \"redhat-marketplace-mc8kx\" (UID: \"81839b26-cf66-4532-a646-ef4cd5d5e471\") " pod="openshift-marketplace/redhat-marketplace-mc8kx"
Dec 03 19:56:07.593724 master-0 kubenswrapper[9368]: I1203 19:56:07.593549 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckcc5\" (UniqueName: \"kubernetes.io/projected/81839b26-cf66-4532-a646-ef4cd5d5e471-kube-api-access-ckcc5\") pod \"redhat-marketplace-mc8kx\" (UID: \"81839b26-cf66-4532-a646-ef4cd5d5e471\") " pod="openshift-marketplace/redhat-marketplace-mc8kx"
Dec 03 19:56:07.695124 master-0 kubenswrapper[9368]: I1203 19:56:07.695080 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckcc5\" (UniqueName: \"kubernetes.io/projected/81839b26-cf66-4532-a646-ef4cd5d5e471-kube-api-access-ckcc5\") pod \"redhat-marketplace-mc8kx\" (UID: \"81839b26-cf66-4532-a646-ef4cd5d5e471\") " pod="openshift-marketplace/redhat-marketplace-mc8kx"
Dec 03 19:56:07.695350 master-0 kubenswrapper[9368]: I1203 19:56:07.695159 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81839b26-cf66-4532-a646-ef4cd5d5e471-catalog-content\") pod \"redhat-marketplace-mc8kx\" (UID: \"81839b26-cf66-4532-a646-ef4cd5d5e471\") " pod="openshift-marketplace/redhat-marketplace-mc8kx"
Dec 03 19:56:07.695350 master-0 kubenswrapper[9368]: I1203 19:56:07.695192 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81839b26-cf66-4532-a646-ef4cd5d5e471-utilities\") pod \"redhat-marketplace-mc8kx\" (UID: \"81839b26-cf66-4532-a646-ef4cd5d5e471\") " pod="openshift-marketplace/redhat-marketplace-mc8kx"
Dec 03 19:56:07.695669 master-0 kubenswrapper[9368]: I1203 19:56:07.695646 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81839b26-cf66-4532-a646-ef4cd5d5e471-utilities\") pod \"redhat-marketplace-mc8kx\" (UID: \"81839b26-cf66-4532-a646-ef4cd5d5e471\") " pod="openshift-marketplace/redhat-marketplace-mc8kx"
Dec 03 19:56:07.695742 master-0 kubenswrapper[9368]: I1203 19:56:07.695706 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81839b26-cf66-4532-a646-ef4cd5d5e471-catalog-content\") pod \"redhat-marketplace-mc8kx\" (UID: \"81839b26-cf66-4532-a646-ef4cd5d5e471\") " pod="openshift-marketplace/redhat-marketplace-mc8kx"
Dec 03 19:56:08.016259 master-0 kubenswrapper[9368]: I1203 19:56:08.016162 9368 generic.go:334] "Generic (PLEG): container finished" podID="acb1d894-1bc0-478d-87fc-e9137291df70" containerID="379e67ab2df011ed3e1a17dd2f3396c98a79650720f2401a156beabfaf028de9" exitCode=0
Dec 03 19:56:08.017184 master-0 kubenswrapper[9368]: I1203 19:56:08.016268 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2c8x" event={"ID":"acb1d894-1bc0-478d-87fc-e9137291df70","Type":"ContainerDied","Data":"379e67ab2df011ed3e1a17dd2f3396c98a79650720f2401a156beabfaf028de9"}
Dec 03 19:56:08.018318 master-0 kubenswrapper[9368]: I1203 19:56:08.017939 9368 generic.go:334] "Generic (PLEG): container finished" podID="48dfa48e-caea-4017-bd3e-d1da8bcd2da7" containerID="11032c235e99b15636a3920f484a0a8fd50c568319f66ba43948c41f56636e33" exitCode=0
Dec 03 19:56:08.018554 master-0 kubenswrapper[9368]: I1203 19:56:08.018507 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sp868" event={"ID":"48dfa48e-caea-4017-bd3e-d1da8bcd2da7","Type":"ContainerDied","Data":"11032c235e99b15636a3920f484a0a8fd50c568319f66ba43948c41f56636e33"}
Dec 03 19:56:08.086432 master-0 kubenswrapper[9368]: I1203 19:56:08.085211 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Dec 03 19:56:08.086432 master-0 kubenswrapper[9368]: I1203 19:56:08.085525 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-2-master-0" podUID="57c8b80e-76a2-4021-a94d-329689a6ae77" containerName="installer" containerID="cri-o://ee4f175a218e9a194bdf882bfe97c9b5996e8b86be2b8fc0a0d51a2b228723a3" gracePeriod=30
Dec 03 19:56:08.122487 master-0 kubenswrapper[9368]: I1203 19:56:08.121690 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckcc5\" (UniqueName: \"kubernetes.io/projected/81839b26-cf66-4532-a646-ef4cd5d5e471-kube-api-access-ckcc5\") pod \"redhat-marketplace-mc8kx\" (UID: \"81839b26-cf66-4532-a646-ef4cd5d5e471\") " pod="openshift-marketplace/redhat-marketplace-mc8kx"
Dec 03 19:56:08.132430 master-0 kubenswrapper[9368]: I1203 19:56:08.132361 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mc8kx"
Dec 03 19:56:08.559174 master-0 kubenswrapper[9368]: I1203 19:56:08.559097 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_57c8b80e-76a2-4021-a94d-329689a6ae77/installer/0.log"
Dec 03 19:56:08.559174 master-0 kubenswrapper[9368]: I1203 19:56:08.559159 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0"
Dec 03 19:56:08.594060 master-0 kubenswrapper[9368]: I1203 19:56:08.594007 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mc8kx"]
Dec 03 19:56:08.705370 master-0 kubenswrapper[9368]: I1203 19:56:08.705315 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6zrxk"]
Dec 03 19:56:08.705545 master-0 kubenswrapper[9368]: E1203 19:56:08.705526 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57c8b80e-76a2-4021-a94d-329689a6ae77" containerName="installer"
Dec 03 19:56:08.705545 master-0 kubenswrapper[9368]: I1203 19:56:08.705542 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="57c8b80e-76a2-4021-a94d-329689a6ae77" containerName="installer"
Dec 03 19:56:08.705714 master-0 kubenswrapper[9368]: I1203 19:56:08.705697 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="57c8b80e-76a2-4021-a94d-329689a6ae77" containerName="installer"
Dec 03 19:56:08.706319 master-0 kubenswrapper[9368]: I1203 19:56:08.706292 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6zrxk"
Dec 03 19:56:08.722691 master-0 kubenswrapper[9368]: I1203 19:56:08.722634 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6zrxk"]
Dec 03 19:56:08.735152 master-0 kubenswrapper[9368]: I1203 19:56:08.735047 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57c8b80e-76a2-4021-a94d-329689a6ae77-kube-api-access\") pod \"57c8b80e-76a2-4021-a94d-329689a6ae77\" (UID: \"57c8b80e-76a2-4021-a94d-329689a6ae77\") "
Dec 03 19:56:08.735263 master-0 kubenswrapper[9368]: I1203 19:56:08.735234 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/57c8b80e-76a2-4021-a94d-329689a6ae77-var-lock\") pod \"57c8b80e-76a2-4021-a94d-329689a6ae77\" (UID: \"57c8b80e-76a2-4021-a94d-329689a6ae77\") "
Dec 03 19:56:08.735344 master-0 kubenswrapper[9368]: I1203 19:56:08.735274 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57c8b80e-76a2-4021-a94d-329689a6ae77-kubelet-dir\") pod \"57c8b80e-76a2-4021-a94d-329689a6ae77\" (UID: \"57c8b80e-76a2-4021-a94d-329689a6ae77\") "
Dec 03 19:56:08.735569 master-0 kubenswrapper[9368]: I1203 19:56:08.735532 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca\") pod \"controller-manager-6877695c95-4kmf4\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4"
Dec 03 19:56:08.735704 master-0 kubenswrapper[9368]: E1203 19:56:08.735684 9368 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Dec 03 19:56:08.735851 master-0 kubenswrapper[9368]: E1203 19:56:08.735735 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca podName:a2233205-0100-449b-81ee-7e13551adf6f nodeName:}" failed. No retries permitted until 2025-12-03 19:56:24.735718854 +0000 UTC m=+50.396968765 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca") pod "controller-manager-6877695c95-4kmf4" (UID: "a2233205-0100-449b-81ee-7e13551adf6f") : configmap "client-ca" not found
Dec 03 19:56:08.735851 master-0 kubenswrapper[9368]: I1203 19:56:08.735792 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57c8b80e-76a2-4021-a94d-329689a6ae77-var-lock" (OuterVolumeSpecName: "var-lock") pod "57c8b80e-76a2-4021-a94d-329689a6ae77" (UID: "57c8b80e-76a2-4021-a94d-329689a6ae77"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:56:08.735851 master-0 kubenswrapper[9368]: I1203 19:56:08.735819 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57c8b80e-76a2-4021-a94d-329689a6ae77-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "57c8b80e-76a2-4021-a94d-329689a6ae77" (UID: "57c8b80e-76a2-4021-a94d-329689a6ae77"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:56:08.738846 master-0 kubenswrapper[9368]: I1203 19:56:08.738769 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57c8b80e-76a2-4021-a94d-329689a6ae77-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "57c8b80e-76a2-4021-a94d-329689a6ae77" (UID: "57c8b80e-76a2-4021-a94d-329689a6ae77"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:56:08.837510 master-0 kubenswrapper[9368]: I1203 19:56:08.836312 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af6f6483-5ca1-48b7-90b5-b03d460d041a-catalog-content\") pod \"redhat-operators-6zrxk\" (UID: \"af6f6483-5ca1-48b7-90b5-b03d460d041a\") " pod="openshift-marketplace/redhat-operators-6zrxk"
Dec 03 19:56:08.837510 master-0 kubenswrapper[9368]: I1203 19:56:08.836419 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af6f6483-5ca1-48b7-90b5-b03d460d041a-utilities\") pod \"redhat-operators-6zrxk\" (UID: \"af6f6483-5ca1-48b7-90b5-b03d460d041a\") " pod="openshift-marketplace/redhat-operators-6zrxk"
Dec 03 19:56:08.837510 master-0 kubenswrapper[9368]: I1203 19:56:08.836458 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwtpl\" (UniqueName: \"kubernetes.io/projected/af6f6483-5ca1-48b7-90b5-b03d460d041a-kube-api-access-mwtpl\") pod \"redhat-operators-6zrxk\" (UID: \"af6f6483-5ca1-48b7-90b5-b03d460d041a\") " pod="openshift-marketplace/redhat-operators-6zrxk"
Dec 03 19:56:08.837510 master-0 kubenswrapper[9368]: I1203 19:56:08.836533 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57c8b80e-76a2-4021-a94d-329689a6ae77-kube-api-access\") on node \"master-0\" DevicePath \"\""
Dec 03 19:56:08.837510 master-0 kubenswrapper[9368]: I1203 19:56:08.836544 9368 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/57c8b80e-76a2-4021-a94d-329689a6ae77-var-lock\") on node \"master-0\" DevicePath \"\""
Dec 03 19:56:08.837510 master-0 kubenswrapper[9368]: I1203 19:56:08.836553 9368 reconciler_common.go:293] "Volume
detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/57c8b80e-76a2-4021-a94d-329689a6ae77-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 19:56:08.938079 master-0 kubenswrapper[9368]: I1203 19:56:08.937905 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af6f6483-5ca1-48b7-90b5-b03d460d041a-catalog-content\") pod \"redhat-operators-6zrxk\" (UID: \"af6f6483-5ca1-48b7-90b5-b03d460d041a\") " pod="openshift-marketplace/redhat-operators-6zrxk" Dec 03 19:56:08.938079 master-0 kubenswrapper[9368]: I1203 19:56:08.937991 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af6f6483-5ca1-48b7-90b5-b03d460d041a-utilities\") pod \"redhat-operators-6zrxk\" (UID: \"af6f6483-5ca1-48b7-90b5-b03d460d041a\") " pod="openshift-marketplace/redhat-operators-6zrxk" Dec 03 19:56:08.938079 master-0 kubenswrapper[9368]: I1203 19:56:08.938028 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwtpl\" (UniqueName: \"kubernetes.io/projected/af6f6483-5ca1-48b7-90b5-b03d460d041a-kube-api-access-mwtpl\") pod \"redhat-operators-6zrxk\" (UID: \"af6f6483-5ca1-48b7-90b5-b03d460d041a\") " pod="openshift-marketplace/redhat-operators-6zrxk" Dec 03 19:56:08.938571 master-0 kubenswrapper[9368]: I1203 19:56:08.938475 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af6f6483-5ca1-48b7-90b5-b03d460d041a-catalog-content\") pod \"redhat-operators-6zrxk\" (UID: \"af6f6483-5ca1-48b7-90b5-b03d460d041a\") " pod="openshift-marketplace/redhat-operators-6zrxk" Dec 03 19:56:08.938789 master-0 kubenswrapper[9368]: I1203 19:56:08.938639 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/af6f6483-5ca1-48b7-90b5-b03d460d041a-utilities\") pod \"redhat-operators-6zrxk\" (UID: \"af6f6483-5ca1-48b7-90b5-b03d460d041a\") " pod="openshift-marketplace/redhat-operators-6zrxk" Dec 03 19:56:08.952837 master-0 kubenswrapper[9368]: I1203 19:56:08.952707 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwtpl\" (UniqueName: \"kubernetes.io/projected/af6f6483-5ca1-48b7-90b5-b03d460d041a-kube-api-access-mwtpl\") pod \"redhat-operators-6zrxk\" (UID: \"af6f6483-5ca1-48b7-90b5-b03d460d041a\") " pod="openshift-marketplace/redhat-operators-6zrxk" Dec 03 19:56:09.024262 master-0 kubenswrapper[9368]: I1203 19:56:09.024218 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6zrxk" Dec 03 19:56:09.024711 master-0 kubenswrapper[9368]: I1203 19:56:09.024576 9368 generic.go:334] "Generic (PLEG): container finished" podID="81839b26-cf66-4532-a646-ef4cd5d5e471" containerID="6a1397a67e232c3bd544bba3152a421dfc26a504ea7d093a2315998dd96b8057" exitCode=0 Dec 03 19:56:09.024711 master-0 kubenswrapper[9368]: I1203 19:56:09.024678 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc8kx" event={"ID":"81839b26-cf66-4532-a646-ef4cd5d5e471","Type":"ContainerDied","Data":"6a1397a67e232c3bd544bba3152a421dfc26a504ea7d093a2315998dd96b8057"} Dec 03 19:56:09.024711 master-0 kubenswrapper[9368]: I1203 19:56:09.024704 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc8kx" event={"ID":"81839b26-cf66-4532-a646-ef4cd5d5e471","Type":"ContainerStarted","Data":"148d3d0ae63a175305173f860008d660572daa7838487974f2a9f003f59eeff0"} Dec 03 19:56:09.027224 master-0 kubenswrapper[9368]: I1203 19:56:09.027182 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_57c8b80e-76a2-4021-a94d-329689a6ae77/installer/0.log" Dec 03 
19:56:09.027298 master-0 kubenswrapper[9368]: I1203 19:56:09.027249 9368 generic.go:334] "Generic (PLEG): container finished" podID="57c8b80e-76a2-4021-a94d-329689a6ae77" containerID="ee4f175a218e9a194bdf882bfe97c9b5996e8b86be2b8fc0a0d51a2b228723a3" exitCode=1 Dec 03 19:56:09.027331 master-0 kubenswrapper[9368]: I1203 19:56:09.027297 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 19:56:09.027359 master-0 kubenswrapper[9368]: I1203 19:56:09.027329 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"57c8b80e-76a2-4021-a94d-329689a6ae77","Type":"ContainerDied","Data":"ee4f175a218e9a194bdf882bfe97c9b5996e8b86be2b8fc0a0d51a2b228723a3"} Dec 03 19:56:09.027398 master-0 kubenswrapper[9368]: I1203 19:56:09.027379 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"57c8b80e-76a2-4021-a94d-329689a6ae77","Type":"ContainerDied","Data":"78e39900d57bd0cc50509a32a805aacc896f9b41d8084d5242f5aee8df82f861"} Dec 03 19:56:09.027460 master-0 kubenswrapper[9368]: I1203 19:56:09.027420 9368 scope.go:117] "RemoveContainer" containerID="ee4f175a218e9a194bdf882bfe97c9b5996e8b86be2b8fc0a0d51a2b228723a3" Dec 03 19:56:09.029525 master-0 kubenswrapper[9368]: I1203 19:56:09.029183 9368 generic.go:334] "Generic (PLEG): container finished" podID="f96c70ce-314a-4919-91e9-cc776a620846" containerID="e2b66d198b3f4fe0e6018d9d1aa589a9d8ed0ff0683d77115b9a3013153ec256" exitCode=0 Dec 03 19:56:09.029525 master-0 kubenswrapper[9368]: I1203 19:56:09.029214 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" event={"ID":"f96c70ce-314a-4919-91e9-cc776a620846","Type":"ContainerDied","Data":"e2b66d198b3f4fe0e6018d9d1aa589a9d8ed0ff0683d77115b9a3013153ec256"} Dec 03 19:56:09.049114 master-0 kubenswrapper[9368]: I1203 
19:56:09.048628 9368 scope.go:117] "RemoveContainer" containerID="ee4f175a218e9a194bdf882bfe97c9b5996e8b86be2b8fc0a0d51a2b228723a3" Dec 03 19:56:09.049167 master-0 kubenswrapper[9368]: E1203 19:56:09.049109 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee4f175a218e9a194bdf882bfe97c9b5996e8b86be2b8fc0a0d51a2b228723a3\": container with ID starting with ee4f175a218e9a194bdf882bfe97c9b5996e8b86be2b8fc0a0d51a2b228723a3 not found: ID does not exist" containerID="ee4f175a218e9a194bdf882bfe97c9b5996e8b86be2b8fc0a0d51a2b228723a3" Dec 03 19:56:09.049167 master-0 kubenswrapper[9368]: I1203 19:56:09.049137 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee4f175a218e9a194bdf882bfe97c9b5996e8b86be2b8fc0a0d51a2b228723a3"} err="failed to get container status \"ee4f175a218e9a194bdf882bfe97c9b5996e8b86be2b8fc0a0d51a2b228723a3\": rpc error: code = NotFound desc = could not find container \"ee4f175a218e9a194bdf882bfe97c9b5996e8b86be2b8fc0a0d51a2b228723a3\": container with ID starting with ee4f175a218e9a194bdf882bfe97c9b5996e8b86be2b8fc0a0d51a2b228723a3 not found: ID does not exist" Dec 03 19:56:09.057555 master-0 kubenswrapper[9368]: I1203 19:56:09.057514 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 03 19:56:09.061194 master-0 kubenswrapper[9368]: I1203 19:56:09.061166 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 03 19:56:09.459333 master-0 kubenswrapper[9368]: I1203 19:56:09.459299 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6zrxk"] Dec 03 19:56:09.737446 master-0 kubenswrapper[9368]: I1203 19:56:09.736895 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:56:09.737446 master-0 
kubenswrapper[9368]: I1203 19:56:09.736973 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:56:09.743760 master-0 kubenswrapper[9368]: I1203 19:56:09.743667 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:56:10.039824 master-0 kubenswrapper[9368]: I1203 19:56:10.039688 9368 generic.go:334] "Generic (PLEG): container finished" podID="af6f6483-5ca1-48b7-90b5-b03d460d041a" containerID="e1947338732792ae52be0e859ad4f708f515c228475b3b078f809c470c16a7d3" exitCode=0 Dec 03 19:56:10.040909 master-0 kubenswrapper[9368]: I1203 19:56:10.039887 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zrxk" event={"ID":"af6f6483-5ca1-48b7-90b5-b03d460d041a","Type":"ContainerDied","Data":"e1947338732792ae52be0e859ad4f708f515c228475b3b078f809c470c16a7d3"} Dec 03 19:56:10.040909 master-0 kubenswrapper[9368]: I1203 19:56:10.039952 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zrxk" event={"ID":"af6f6483-5ca1-48b7-90b5-b03d460d041a","Type":"ContainerStarted","Data":"cd824949fd38781abd9d180d17b219a566658c17f085e62ea63db83c4ab6d2d5"} Dec 03 19:56:10.043467 master-0 kubenswrapper[9368]: I1203 19:56:10.043426 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" event={"ID":"f96c70ce-314a-4919-91e9-cc776a620846","Type":"ContainerStarted","Data":"008ac7975eb212a7ea064a8e38a72df42f40e0306961b55aa18940280cbfe7c6"} Dec 03 19:56:10.049268 master-0 kubenswrapper[9368]: I1203 19:56:10.049160 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 19:56:10.130731 master-0 kubenswrapper[9368]: I1203 19:56:10.130620 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" podStartSLOduration=9.011815597 podStartE2EDuration="12.130593033s" podCreationTimestamp="2025-12-03 19:55:58 +0000 UTC" firstStartedPulling="2025-12-03 19:56:05.044980739 +0000 UTC m=+30.706230650" lastFinishedPulling="2025-12-03 19:56:08.163758165 +0000 UTC m=+33.825008086" observedRunningTime="2025-12-03 19:56:10.104670194 +0000 UTC m=+35.765920105" watchObservedRunningTime="2025-12-03 19:56:10.130593033 +0000 UTC m=+35.791842954" Dec 03 19:56:10.374263 master-0 kubenswrapper[9368]: I1203 19:56:10.373809 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6877695c95-4kmf4"] Dec 03 19:56:10.374263 master-0 kubenswrapper[9368]: E1203 19:56:10.374175 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" podUID="a2233205-0100-449b-81ee-7e13551adf6f" Dec 03 19:56:10.388374 master-0 kubenswrapper[9368]: I1203 19:56:10.387498 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj"] Dec 03 19:56:10.388374 master-0 kubenswrapper[9368]: E1203 19:56:10.387823 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" podUID="82718569-4870-4f94-b2e7-7ccd7d4de8ff" Dec 03 19:56:10.560101 master-0 kubenswrapper[9368]: I1203 19:56:10.560053 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57c8b80e-76a2-4021-a94d-329689a6ae77" path="/var/lib/kubelet/pods/57c8b80e-76a2-4021-a94d-329689a6ae77/volumes" Dec 03 19:56:10.816429 master-0 kubenswrapper[9368]: I1203 19:56:10.813096 9368 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 03 19:56:10.816429 master-0 kubenswrapper[9368]: I1203 19:56:10.813837 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 19:56:10.827672 master-0 kubenswrapper[9368]: I1203 19:56:10.827636 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 03 19:56:10.965602 master-0 kubenswrapper[9368]: I1203 19:56:10.965519 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93a5a792-4066-4863-a409-aaeb1b6ac193-kube-api-access\") pod \"installer-3-master-0\" (UID: \"93a5a792-4066-4863-a409-aaeb1b6ac193\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 19:56:10.965602 master-0 kubenswrapper[9368]: I1203 19:56:10.965573 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/93a5a792-4066-4863-a409-aaeb1b6ac193-var-lock\") pod \"installer-3-master-0\" (UID: \"93a5a792-4066-4863-a409-aaeb1b6ac193\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 19:56:10.965602 master-0 kubenswrapper[9368]: I1203 19:56:10.965590 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93a5a792-4066-4863-a409-aaeb1b6ac193-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"93a5a792-4066-4863-a409-aaeb1b6ac193\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 19:56:11.047731 master-0 kubenswrapper[9368]: I1203 19:56:11.047687 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" Dec 03 19:56:11.048901 master-0 kubenswrapper[9368]: I1203 19:56:11.048878 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:56:11.054496 master-0 kubenswrapper[9368]: I1203 19:56:11.054468 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj" Dec 03 19:56:11.059867 master-0 kubenswrapper[9368]: I1203 19:56:11.059838 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4" Dec 03 19:56:11.067206 master-0 kubenswrapper[9368]: I1203 19:56:11.067147 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93a5a792-4066-4863-a409-aaeb1b6ac193-kube-api-access\") pod \"installer-3-master-0\" (UID: \"93a5a792-4066-4863-a409-aaeb1b6ac193\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 19:56:11.067206 master-0 kubenswrapper[9368]: I1203 19:56:11.067195 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/93a5a792-4066-4863-a409-aaeb1b6ac193-var-lock\") pod \"installer-3-master-0\" (UID: \"93a5a792-4066-4863-a409-aaeb1b6ac193\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 19:56:11.067304 master-0 kubenswrapper[9368]: I1203 19:56:11.067214 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93a5a792-4066-4863-a409-aaeb1b6ac193-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"93a5a792-4066-4863-a409-aaeb1b6ac193\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 19:56:11.067304 
master-0 kubenswrapper[9368]: I1203 19:56:11.067264 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/93a5a792-4066-4863-a409-aaeb1b6ac193-var-lock\") pod \"installer-3-master-0\" (UID: \"93a5a792-4066-4863-a409-aaeb1b6ac193\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 19:56:11.067304 master-0 kubenswrapper[9368]: I1203 19:56:11.067273 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93a5a792-4066-4863-a409-aaeb1b6ac193-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"93a5a792-4066-4863-a409-aaeb1b6ac193\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 19:56:11.083357 master-0 kubenswrapper[9368]: I1203 19:56:11.083326 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93a5a792-4066-4863-a409-aaeb1b6ac193-kube-api-access\") pod \"installer-3-master-0\" (UID: \"93a5a792-4066-4863-a409-aaeb1b6ac193\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 19:56:11.139557 master-0 kubenswrapper[9368]: I1203 19:56:11.139154 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 19:56:11.170357 master-0 kubenswrapper[9368]: I1203 19:56:11.170307 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert\") pod \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " Dec 03 19:56:11.170357 master-0 kubenswrapper[9368]: I1203 19:56:11.170366 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jczb\" (UniqueName: \"kubernetes.io/projected/a2233205-0100-449b-81ee-7e13551adf6f-kube-api-access-4jczb\") pod \"a2233205-0100-449b-81ee-7e13551adf6f\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " Dec 03 19:56:11.170622 master-0 kubenswrapper[9368]: I1203 19:56:11.170403 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj5hc\" (UniqueName: \"kubernetes.io/projected/82718569-4870-4f94-b2e7-7ccd7d4de8ff-kube-api-access-zj5hc\") pod \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " Dec 03 19:56:11.170622 master-0 kubenswrapper[9368]: I1203 19:56:11.170447 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-config\") pod \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\" (UID: \"82718569-4870-4f94-b2e7-7ccd7d4de8ff\") " Dec 03 19:56:11.170622 master-0 kubenswrapper[9368]: I1203 19:56:11.170491 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-proxy-ca-bundles\") pod \"a2233205-0100-449b-81ee-7e13551adf6f\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " Dec 03 19:56:11.170622 master-0 kubenswrapper[9368]: I1203 19:56:11.170543 9368 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2233205-0100-449b-81ee-7e13551adf6f-serving-cert\") pod \"a2233205-0100-449b-81ee-7e13551adf6f\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " Dec 03 19:56:11.170622 master-0 kubenswrapper[9368]: I1203 19:56:11.170577 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-config\") pod \"a2233205-0100-449b-81ee-7e13551adf6f\" (UID: \"a2233205-0100-449b-81ee-7e13551adf6f\") " Dec 03 19:56:11.172907 master-0 kubenswrapper[9368]: I1203 19:56:11.172522 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-config" (OuterVolumeSpecName: "config") pod "a2233205-0100-449b-81ee-7e13551adf6f" (UID: "a2233205-0100-449b-81ee-7e13551adf6f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:56:11.173427 master-0 kubenswrapper[9368]: I1203 19:56:11.173272 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-config" (OuterVolumeSpecName: "config") pod "82718569-4870-4f94-b2e7-7ccd7d4de8ff" (UID: "82718569-4870-4f94-b2e7-7ccd7d4de8ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:56:11.173848 master-0 kubenswrapper[9368]: I1203 19:56:11.173721 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a2233205-0100-449b-81ee-7e13551adf6f" (UID: "a2233205-0100-449b-81ee-7e13551adf6f"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:56:11.174411 master-0 kubenswrapper[9368]: I1203 19:56:11.174330 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82718569-4870-4f94-b2e7-7ccd7d4de8ff-kube-api-access-zj5hc" (OuterVolumeSpecName: "kube-api-access-zj5hc") pod "82718569-4870-4f94-b2e7-7ccd7d4de8ff" (UID: "82718569-4870-4f94-b2e7-7ccd7d4de8ff"). InnerVolumeSpecName "kube-api-access-zj5hc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:56:11.174759 master-0 kubenswrapper[9368]: I1203 19:56:11.174575 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2233205-0100-449b-81ee-7e13551adf6f-kube-api-access-4jczb" (OuterVolumeSpecName: "kube-api-access-4jczb") pod "a2233205-0100-449b-81ee-7e13551adf6f" (UID: "a2233205-0100-449b-81ee-7e13551adf6f"). InnerVolumeSpecName "kube-api-access-4jczb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:56:11.175158 master-0 kubenswrapper[9368]: I1203 19:56:11.175071 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "82718569-4870-4f94-b2e7-7ccd7d4de8ff" (UID: "82718569-4870-4f94-b2e7-7ccd7d4de8ff"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:56:11.176114 master-0 kubenswrapper[9368]: I1203 19:56:11.176051 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2233205-0100-449b-81ee-7e13551adf6f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a2233205-0100-449b-81ee-7e13551adf6f" (UID: "a2233205-0100-449b-81ee-7e13551adf6f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:56:11.271726 master-0 kubenswrapper[9368]: I1203 19:56:11.271679 9368 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-config\") on node \"master-0\" DevicePath \"\"" Dec 03 19:56:11.271726 master-0 kubenswrapper[9368]: I1203 19:56:11.271712 9368 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82718569-4870-4f94-b2e7-7ccd7d4de8ff-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 19:56:11.271726 master-0 kubenswrapper[9368]: I1203 19:56:11.271724 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jczb\" (UniqueName: \"kubernetes.io/projected/a2233205-0100-449b-81ee-7e13551adf6f-kube-api-access-4jczb\") on node \"master-0\" DevicePath \"\"" Dec 03 19:56:11.271726 master-0 kubenswrapper[9368]: I1203 19:56:11.271733 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj5hc\" (UniqueName: \"kubernetes.io/projected/82718569-4870-4f94-b2e7-7ccd7d4de8ff-kube-api-access-zj5hc\") on node \"master-0\" DevicePath \"\"" Dec 03 19:56:11.271726 master-0 kubenswrapper[9368]: I1203 19:56:11.271742 9368 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-config\") on node \"master-0\" DevicePath \"\"" Dec 03 19:56:11.272107 master-0 kubenswrapper[9368]: I1203 19:56:11.271751 9368 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Dec 03 19:56:11.272107 master-0 kubenswrapper[9368]: I1203 19:56:11.271763 9368 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2233205-0100-449b-81ee-7e13551adf6f-serving-cert\") on 
node \"master-0\" DevicePath \"\""
Dec 03 19:56:11.596462 master-0 kubenswrapper[9368]: I1203 19:56:11.596406 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"]
Dec 03 19:56:11.597042 master-0 kubenswrapper[9368]: I1203 19:56:11.597022 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Dec 03 19:56:11.599003 master-0 kubenswrapper[9368]: I1203 19:56:11.598971 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Dec 03 19:56:11.627662 master-0 kubenswrapper[9368]: I1203 19:56:11.627610 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"]
Dec 03 19:56:11.777374 master-0 kubenswrapper[9368]: I1203 19:56:11.777293 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/186cc14f-5f58-43ca-8ffa-db07606ff0f7-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"186cc14f-5f58-43ca-8ffa-db07606ff0f7\") " pod="openshift-kube-apiserver/installer-1-master-0"
Dec 03 19:56:11.777653 master-0 kubenswrapper[9368]: I1203 19:56:11.777386 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/186cc14f-5f58-43ca-8ffa-db07606ff0f7-kube-api-access\") pod \"installer-1-master-0\" (UID: \"186cc14f-5f58-43ca-8ffa-db07606ff0f7\") " pod="openshift-kube-apiserver/installer-1-master-0"
Dec 03 19:56:11.777653 master-0 kubenswrapper[9368]: I1203 19:56:11.777439 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/186cc14f-5f58-43ca-8ffa-db07606ff0f7-var-lock\") pod \"installer-1-master-0\" (UID: \"186cc14f-5f58-43ca-8ffa-db07606ff0f7\") " pod="openshift-kube-apiserver/installer-1-master-0"
Dec 03 19:56:11.878515 master-0 kubenswrapper[9368]: I1203 19:56:11.878378 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/186cc14f-5f58-43ca-8ffa-db07606ff0f7-kube-api-access\") pod \"installer-1-master-0\" (UID: \"186cc14f-5f58-43ca-8ffa-db07606ff0f7\") " pod="openshift-kube-apiserver/installer-1-master-0"
Dec 03 19:56:11.878515 master-0 kubenswrapper[9368]: I1203 19:56:11.878478 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/186cc14f-5f58-43ca-8ffa-db07606ff0f7-var-lock\") pod \"installer-1-master-0\" (UID: \"186cc14f-5f58-43ca-8ffa-db07606ff0f7\") " pod="openshift-kube-apiserver/installer-1-master-0"
Dec 03 19:56:11.878515 master-0 kubenswrapper[9368]: I1203 19:56:11.878522 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/186cc14f-5f58-43ca-8ffa-db07606ff0f7-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"186cc14f-5f58-43ca-8ffa-db07606ff0f7\") " pod="openshift-kube-apiserver/installer-1-master-0"
Dec 03 19:56:11.878936 master-0 kubenswrapper[9368]: I1203 19:56:11.878597 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/186cc14f-5f58-43ca-8ffa-db07606ff0f7-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"186cc14f-5f58-43ca-8ffa-db07606ff0f7\") " pod="openshift-kube-apiserver/installer-1-master-0"
Dec 03 19:56:11.878936 master-0 kubenswrapper[9368]: I1203 19:56:11.878610 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/186cc14f-5f58-43ca-8ffa-db07606ff0f7-var-lock\") pod \"installer-1-master-0\" (UID: \"186cc14f-5f58-43ca-8ffa-db07606ff0f7\") " pod="openshift-kube-apiserver/installer-1-master-0"
Dec 03 19:56:11.895692 master-0 kubenswrapper[9368]: I1203 19:56:11.895649 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/186cc14f-5f58-43ca-8ffa-db07606ff0f7-kube-api-access\") pod \"installer-1-master-0\" (UID: \"186cc14f-5f58-43ca-8ffa-db07606ff0f7\") " pod="openshift-kube-apiserver/installer-1-master-0"
Dec 03 19:56:11.932982 master-0 kubenswrapper[9368]: I1203 19:56:11.932921 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Dec 03 19:56:12.052868 master-0 kubenswrapper[9368]: I1203 19:56:12.052417 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6877695c95-4kmf4"
Dec 03 19:56:12.056309 master-0 kubenswrapper[9368]: I1203 19:56:12.056279 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj"
Dec 03 19:56:12.130358 master-0 kubenswrapper[9368]: I1203 19:56:12.129355 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-68f766fc9-lwgcg"]
Dec 03 19:56:12.130358 master-0 kubenswrapper[9368]: I1203 19:56:12.129935 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:12.133848 master-0 kubenswrapper[9368]: I1203 19:56:12.133704 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 03 19:56:12.133913 master-0 kubenswrapper[9368]: I1203 19:56:12.133886 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 03 19:56:12.134900 master-0 kubenswrapper[9368]: I1203 19:56:12.134881 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 03 19:56:12.138579 master-0 kubenswrapper[9368]: I1203 19:56:12.138482 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 03 19:56:12.142802 master-0 kubenswrapper[9368]: I1203 19:56:12.142746 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6877695c95-4kmf4"]
Dec 03 19:56:12.143693 master-0 kubenswrapper[9368]: I1203 19:56:12.143658 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6877695c95-4kmf4"]
Dec 03 19:56:12.143842 master-0 kubenswrapper[9368]: I1203 19:56:12.143818 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 03 19:56:12.144460 master-0 kubenswrapper[9368]: I1203 19:56:12.144312 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 03 19:56:12.146040 master-0 kubenswrapper[9368]: I1203 19:56:12.146018 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68f766fc9-lwgcg"]
Dec 03 19:56:12.176430 master-0 kubenswrapper[9368]: I1203 19:56:12.176375 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj"]
Dec 03 19:56:12.179265 master-0 kubenswrapper[9368]: I1203 19:56:12.179217 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bbbcd887-h4khj"]
Dec 03 19:56:12.282538 master-0 kubenswrapper[9368]: I1203 19:56:12.282477 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e727b97c-263b-430a-8502-106236863710-serving-cert\") pod \"controller-manager-68f766fc9-lwgcg\" (UID: \"e727b97c-263b-430a-8502-106236863710\") " pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:12.282538 master-0 kubenswrapper[9368]: I1203 19:56:12.282532 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e727b97c-263b-430a-8502-106236863710-proxy-ca-bundles\") pod \"controller-manager-68f766fc9-lwgcg\" (UID: \"e727b97c-263b-430a-8502-106236863710\") " pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:12.282752 master-0 kubenswrapper[9368]: I1203 19:56:12.282704 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e727b97c-263b-430a-8502-106236863710-config\") pod \"controller-manager-68f766fc9-lwgcg\" (UID: \"e727b97c-263b-430a-8502-106236863710\") " pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:12.282814 master-0 kubenswrapper[9368]: I1203 19:56:12.282788 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e727b97c-263b-430a-8502-106236863710-client-ca\") pod \"controller-manager-68f766fc9-lwgcg\" (UID: \"e727b97c-263b-430a-8502-106236863710\") " pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:12.282991 master-0 kubenswrapper[9368]: I1203 19:56:12.282904 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnxkv\" (UniqueName: \"kubernetes.io/projected/e727b97c-263b-430a-8502-106236863710-kube-api-access-qnxkv\") pod \"controller-manager-68f766fc9-lwgcg\" (UID: \"e727b97c-263b-430a-8502-106236863710\") " pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:12.282991 master-0 kubenswrapper[9368]: I1203 19:56:12.282965 9368 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2233205-0100-449b-81ee-7e13551adf6f-client-ca\") on node \"master-0\" DevicePath \"\""
Dec 03 19:56:12.282991 master-0 kubenswrapper[9368]: I1203 19:56:12.282977 9368 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82718569-4870-4f94-b2e7-7ccd7d4de8ff-client-ca\") on node \"master-0\" DevicePath \"\""
Dec 03 19:56:12.384052 master-0 kubenswrapper[9368]: I1203 19:56:12.383948 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e727b97c-263b-430a-8502-106236863710-config\") pod \"controller-manager-68f766fc9-lwgcg\" (UID: \"e727b97c-263b-430a-8502-106236863710\") " pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:12.384944 master-0 kubenswrapper[9368]: I1203 19:56:12.384911 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e727b97c-263b-430a-8502-106236863710-client-ca\") pod \"controller-manager-68f766fc9-lwgcg\" (UID: \"e727b97c-263b-430a-8502-106236863710\") " pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:12.384998 master-0 kubenswrapper[9368]: I1203 19:56:12.384980 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnxkv\" (UniqueName: \"kubernetes.io/projected/e727b97c-263b-430a-8502-106236863710-kube-api-access-qnxkv\") pod \"controller-manager-68f766fc9-lwgcg\" (UID: \"e727b97c-263b-430a-8502-106236863710\") " pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:12.385075 master-0 kubenswrapper[9368]: I1203 19:56:12.385046 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e727b97c-263b-430a-8502-106236863710-serving-cert\") pod \"controller-manager-68f766fc9-lwgcg\" (UID: \"e727b97c-263b-430a-8502-106236863710\") " pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:12.385138 master-0 kubenswrapper[9368]: I1203 19:56:12.385091 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e727b97c-263b-430a-8502-106236863710-proxy-ca-bundles\") pod \"controller-manager-68f766fc9-lwgcg\" (UID: \"e727b97c-263b-430a-8502-106236863710\") " pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:12.385200 master-0 kubenswrapper[9368]: I1203 19:56:12.385173 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e727b97c-263b-430a-8502-106236863710-config\") pod \"controller-manager-68f766fc9-lwgcg\" (UID: \"e727b97c-263b-430a-8502-106236863710\") " pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:12.385786 master-0 kubenswrapper[9368]: I1203 19:56:12.385736 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e727b97c-263b-430a-8502-106236863710-client-ca\") pod \"controller-manager-68f766fc9-lwgcg\" (UID: \"e727b97c-263b-430a-8502-106236863710\") " pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:12.386419 master-0 kubenswrapper[9368]: I1203 19:56:12.386387 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e727b97c-263b-430a-8502-106236863710-proxy-ca-bundles\") pod \"controller-manager-68f766fc9-lwgcg\" (UID: \"e727b97c-263b-430a-8502-106236863710\") " pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:12.555738 master-0 kubenswrapper[9368]: I1203 19:56:12.555679 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82718569-4870-4f94-b2e7-7ccd7d4de8ff" path="/var/lib/kubelet/pods/82718569-4870-4f94-b2e7-7ccd7d4de8ff/volumes"
Dec 03 19:56:12.556116 master-0 kubenswrapper[9368]: I1203 19:56:12.556087 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2233205-0100-449b-81ee-7e13551adf6f" path="/var/lib/kubelet/pods/a2233205-0100-449b-81ee-7e13551adf6f/volumes"
Dec 03 19:56:12.571213 master-0 kubenswrapper[9368]: I1203 19:56:12.571169 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e727b97c-263b-430a-8502-106236863710-serving-cert\") pod \"controller-manager-68f766fc9-lwgcg\" (UID: \"e727b97c-263b-430a-8502-106236863710\") " pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:12.571478 master-0 kubenswrapper[9368]: I1203 19:56:12.571456 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnxkv\" (UniqueName: \"kubernetes.io/projected/e727b97c-263b-430a-8502-106236863710-kube-api-access-qnxkv\") pod \"controller-manager-68f766fc9-lwgcg\" (UID: \"e727b97c-263b-430a-8502-106236863710\") " pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:12.768288 master-0 kubenswrapper[9368]: I1203 19:56:12.768161 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:14.023182 master-0 kubenswrapper[9368]: I1203 19:56:14.023091 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb"
Dec 03 19:56:14.023182 master-0 kubenswrapper[9368]: I1203 19:56:14.023142 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb"
Dec 03 19:56:15.627383 master-0 kubenswrapper[9368]: I1203 19:56:15.627302 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb"
Dec 03 19:56:15.638144 master-0 kubenswrapper[9368]: I1203 19:56:15.638078 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb"
Dec 03 19:56:15.641819 master-0 kubenswrapper[9368]: I1203 19:56:15.641718 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"]
Dec 03 19:56:15.643114 master-0 kubenswrapper[9368]: I1203 19:56:15.643060 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"
Dec 03 19:56:15.646290 master-0 kubenswrapper[9368]: I1203 19:56:15.646224 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 03 19:56:15.660136 master-0 kubenswrapper[9368]: I1203 19:56:15.659832 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 03 19:56:15.660136 master-0 kubenswrapper[9368]: I1203 19:56:15.659958 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 03 19:56:15.660136 master-0 kubenswrapper[9368]: I1203 19:56:15.659845 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 03 19:56:15.660600 master-0 kubenswrapper[9368]: I1203 19:56:15.660522 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 03 19:56:15.830589 master-0 kubenswrapper[9368]: I1203 19:56:15.830332 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-serving-cert\") pod \"route-controller-manager-869d689b5b-brqck\" (UID: \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\") " pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"
Dec 03 19:56:15.830589 master-0 kubenswrapper[9368]: I1203 19:56:15.830415 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-config\") pod \"route-controller-manager-869d689b5b-brqck\" (UID: \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\") " pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"
Dec 03 19:56:15.830589 master-0 kubenswrapper[9368]: I1203 19:56:15.830517 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-client-ca\") pod \"route-controller-manager-869d689b5b-brqck\" (UID: \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\") " pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"
Dec 03 19:56:15.830589 master-0 kubenswrapper[9368]: I1203 19:56:15.830552 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v6cx\" (UniqueName: \"kubernetes.io/projected/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-kube-api-access-9v6cx\") pod \"route-controller-manager-869d689b5b-brqck\" (UID: \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\") " pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"
Dec 03 19:56:15.931288 master-0 kubenswrapper[9368]: I1203 19:56:15.931246 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-client-ca\") pod \"route-controller-manager-869d689b5b-brqck\" (UID: \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\") " pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"
Dec 03 19:56:15.931433 master-0 kubenswrapper[9368]: I1203 19:56:15.931300 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v6cx\" (UniqueName: \"kubernetes.io/projected/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-kube-api-access-9v6cx\") pod \"route-controller-manager-869d689b5b-brqck\" (UID: \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\") " pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"
Dec 03 19:56:15.931433 master-0 kubenswrapper[9368]: I1203 19:56:15.931337 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-serving-cert\") pod \"route-controller-manager-869d689b5b-brqck\" (UID: \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\") " pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"
Dec 03 19:56:15.931433 master-0 kubenswrapper[9368]: I1203 19:56:15.931362 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-config\") pod \"route-controller-manager-869d689b5b-brqck\" (UID: \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\") " pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"
Dec 03 19:56:15.933165 master-0 kubenswrapper[9368]: I1203 19:56:15.933110 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-client-ca\") pod \"route-controller-manager-869d689b5b-brqck\" (UID: \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\") " pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"
Dec 03 19:56:15.934083 master-0 kubenswrapper[9368]: I1203 19:56:15.934040 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-config\") pod \"route-controller-manager-869d689b5b-brqck\" (UID: \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\") " pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"
Dec 03 19:56:15.936006 master-0 kubenswrapper[9368]: I1203 19:56:15.935966 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-serving-cert\") pod \"route-controller-manager-869d689b5b-brqck\" (UID: \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\") " pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"
Dec 03 19:56:16.386432 master-0 kubenswrapper[9368]: I1203 19:56:16.385009 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Dec 03 19:56:16.386432 master-0 kubenswrapper[9368]: I1203 19:56:16.385955 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0"
Dec 03 19:56:16.392770 master-0 kubenswrapper[9368]: I1203 19:56:16.392595 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"]
Dec 03 19:56:16.419099 master-0 kubenswrapper[9368]: I1203 19:56:16.414382 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Dec 03 19:56:16.452805 master-0 kubenswrapper[9368]: I1203 19:56:16.446397 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Dec 03 19:56:16.452805 master-0 kubenswrapper[9368]: I1203 19:56:16.450086 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a454ceb3-7801-4bf0-82d3-112ab93b687b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"a454ceb3-7801-4bf0-82d3-112ab93b687b\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Dec 03 19:56:16.452805 master-0 kubenswrapper[9368]: I1203 19:56:16.450127 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a454ceb3-7801-4bf0-82d3-112ab93b687b-var-lock\") pod \"installer-1-master-0\" (UID: \"a454ceb3-7801-4bf0-82d3-112ab93b687b\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Dec 03 19:56:16.452805 master-0 kubenswrapper[9368]: I1203 19:56:16.450151 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a454ceb3-7801-4bf0-82d3-112ab93b687b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"a454ceb3-7801-4bf0-82d3-112ab93b687b\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Dec 03 19:56:16.460182 master-0 kubenswrapper[9368]: I1203 19:56:16.460127 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v6cx\" (UniqueName: \"kubernetes.io/projected/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-kube-api-access-9v6cx\") pod \"route-controller-manager-869d689b5b-brqck\" (UID: \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\") " pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"
Dec 03 19:56:16.460440 master-0 kubenswrapper[9368]: I1203 19:56:16.460406 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dbfhg"
Dec 03 19:56:16.489859 master-0 kubenswrapper[9368]: I1203 19:56:16.485534 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68f766fc9-lwgcg"]
Dec 03 19:56:16.489859 master-0 kubenswrapper[9368]: I1203 19:56:16.485595 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Dec 03 19:56:16.494094 master-0 kubenswrapper[9368]: I1203 19:56:16.494055 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"]
Dec 03 19:56:16.554806 master-0 kubenswrapper[9368]: I1203 19:56:16.552957 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a454ceb3-7801-4bf0-82d3-112ab93b687b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"a454ceb3-7801-4bf0-82d3-112ab93b687b\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Dec 03 19:56:16.554806 master-0 kubenswrapper[9368]: I1203 19:56:16.553022 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a454ceb3-7801-4bf0-82d3-112ab93b687b-var-lock\") pod \"installer-1-master-0\" (UID: \"a454ceb3-7801-4bf0-82d3-112ab93b687b\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Dec 03 19:56:16.554806 master-0 kubenswrapper[9368]: I1203 19:56:16.553040 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a454ceb3-7801-4bf0-82d3-112ab93b687b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"a454ceb3-7801-4bf0-82d3-112ab93b687b\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Dec 03 19:56:16.554806 master-0 kubenswrapper[9368]: I1203 19:56:16.553970 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a454ceb3-7801-4bf0-82d3-112ab93b687b-var-lock\") pod \"installer-1-master-0\" (UID: \"a454ceb3-7801-4bf0-82d3-112ab93b687b\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Dec 03 19:56:16.554806 master-0 kubenswrapper[9368]: I1203 19:56:16.554029 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a454ceb3-7801-4bf0-82d3-112ab93b687b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"a454ceb3-7801-4bf0-82d3-112ab93b687b\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Dec 03 19:56:16.575760 master-0 kubenswrapper[9368]: I1203 19:56:16.575710 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"
Dec 03 19:56:16.607805 master-0 kubenswrapper[9368]: I1203 19:56:16.604708 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a454ceb3-7801-4bf0-82d3-112ab93b687b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"a454ceb3-7801-4bf0-82d3-112ab93b687b\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Dec 03 19:56:16.805609 master-0 kubenswrapper[9368]: I1203 19:56:16.805500 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0"
Dec 03 19:56:17.835458 master-0 kubenswrapper[9368]: I1203 19:56:17.835389 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf"
Dec 03 19:56:17.883868 master-0 kubenswrapper[9368]: I1203 19:56:17.883809 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j"
Dec 03 19:56:19.496848 master-0 kubenswrapper[9368]: I1203 19:56:19.495944 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg"]
Dec 03 19:56:19.496848 master-0 kubenswrapper[9368]: I1203 19:56:19.496838 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg"
Dec 03 19:56:19.498308 master-0 kubenswrapper[9368]: I1203 19:56:19.498251 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 03 19:56:19.502554 master-0 kubenswrapper[9368]: I1203 19:56:19.502520 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 03 19:56:19.503441 master-0 kubenswrapper[9368]: I1203 19:56:19.503308 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 03 19:56:19.510121 master-0 kubenswrapper[9368]: I1203 19:56:19.509757 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg"]
Dec 03 19:56:19.606488 master-0 kubenswrapper[9368]: I1203 19:56:19.606419 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd35fc5f-07ab-4c66-9b80-33a598d417ef-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-66f4cc99d4-2llfg\" (UID: \"cd35fc5f-07ab-4c66-9b80-33a598d417ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg"
Dec 03 19:56:19.606684 master-0 kubenswrapper[9368]: I1203 19:56:19.606641 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk5wb\" (UniqueName: \"kubernetes.io/projected/cd35fc5f-07ab-4c66-9b80-33a598d417ef-kube-api-access-qk5wb\") pod \"control-plane-machine-set-operator-66f4cc99d4-2llfg\" (UID: \"cd35fc5f-07ab-4c66-9b80-33a598d417ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg"
Dec 03 19:56:19.708380 master-0 kubenswrapper[9368]: I1203 19:56:19.708324 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk5wb\" (UniqueName: \"kubernetes.io/projected/cd35fc5f-07ab-4c66-9b80-33a598d417ef-kube-api-access-qk5wb\") pod \"control-plane-machine-set-operator-66f4cc99d4-2llfg\" (UID: \"cd35fc5f-07ab-4c66-9b80-33a598d417ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg"
Dec 03 19:56:19.708568 master-0 kubenswrapper[9368]: I1203 19:56:19.708400 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd35fc5f-07ab-4c66-9b80-33a598d417ef-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-66f4cc99d4-2llfg\" (UID: \"cd35fc5f-07ab-4c66-9b80-33a598d417ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg"
Dec 03 19:56:19.715826 master-0 kubenswrapper[9368]: I1203 19:56:19.715760 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd35fc5f-07ab-4c66-9b80-33a598d417ef-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-66f4cc99d4-2llfg\" (UID: \"cd35fc5f-07ab-4c66-9b80-33a598d417ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg"
Dec 03 19:56:19.726236 master-0 kubenswrapper[9368]: I1203 19:56:19.726198 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk5wb\" (UniqueName: \"kubernetes.io/projected/cd35fc5f-07ab-4c66-9b80-33a598d417ef-kube-api-access-qk5wb\") pod \"control-plane-machine-set-operator-66f4cc99d4-2llfg\" (UID: \"cd35fc5f-07ab-4c66-9b80-33a598d417ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg"
Dec 03 19:56:19.822298 master-0 kubenswrapper[9368]: I1203 19:56:19.822176 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg"
Dec 03 19:56:20.009348 master-0 kubenswrapper[9368]: I1203 19:56:20.009291 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Dec 03 19:56:20.736066 master-0 kubenswrapper[9368]: I1203 19:56:20.736008 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-1-master-0"]
Dec 03 19:56:20.737679 master-0 kubenswrapper[9368]: I1203 19:56:20.737663 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0"
Dec 03 19:56:20.739832 master-0 kubenswrapper[9368]: I1203 19:56:20.739760 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt"
Dec 03 19:56:20.748820 master-0 kubenswrapper[9368]: I1203 19:56:20.748760 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"]
Dec 03 19:56:20.835869 master-0 kubenswrapper[9368]: I1203 19:56:20.835752 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33ba32c7-9e77-419f-b417-8f1aa28ecd5d-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"33ba32c7-9e77-419f-b417-8f1aa28ecd5d\") " pod="openshift-etcd/installer-1-master-0"
Dec 03 19:56:20.836060 master-0 kubenswrapper[9368]: I1203 19:56:20.835874 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/33ba32c7-9e77-419f-b417-8f1aa28ecd5d-var-lock\") pod \"installer-1-master-0\" (UID: \"33ba32c7-9e77-419f-b417-8f1aa28ecd5d\") " pod="openshift-etcd/installer-1-master-0"
Dec 03 19:56:20.836060 master-0 kubenswrapper[9368]: I1203 19:56:20.835908 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33ba32c7-9e77-419f-b417-8f1aa28ecd5d-kube-api-access\") pod \"installer-1-master-0\" (UID: \"33ba32c7-9e77-419f-b417-8f1aa28ecd5d\") " pod="openshift-etcd/installer-1-master-0"
Dec 03 19:56:20.942594 master-0 kubenswrapper[9368]: I1203 19:56:20.942519 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33ba32c7-9e77-419f-b417-8f1aa28ecd5d-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"33ba32c7-9e77-419f-b417-8f1aa28ecd5d\") " pod="openshift-etcd/installer-1-master-0"
Dec 03 19:56:20.942811 master-0 kubenswrapper[9368]: I1203 19:56:20.942600 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/33ba32c7-9e77-419f-b417-8f1aa28ecd5d-var-lock\") pod \"installer-1-master-0\" (UID: \"33ba32c7-9e77-419f-b417-8f1aa28ecd5d\") " pod="openshift-etcd/installer-1-master-0"
Dec 03 19:56:20.942811 master-0 kubenswrapper[9368]: I1203 19:56:20.942669 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/33ba32c7-9e77-419f-b417-8f1aa28ecd5d-var-lock\") pod \"installer-1-master-0\" (UID: \"33ba32c7-9e77-419f-b417-8f1aa28ecd5d\") " pod="openshift-etcd/installer-1-master-0"
Dec 03 19:56:20.942811 master-0 kubenswrapper[9368]: I1203 19:56:20.942724 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33ba32c7-9e77-419f-b417-8f1aa28ecd5d-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"33ba32c7-9e77-419f-b417-8f1aa28ecd5d\") " pod="openshift-etcd/installer-1-master-0"
Dec 03 19:56:20.942811 master-0 kubenswrapper[9368]: I1203 19:56:20.942766 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33ba32c7-9e77-419f-b417-8f1aa28ecd5d-kube-api-access\") pod \"installer-1-master-0\" (UID: \"33ba32c7-9e77-419f-b417-8f1aa28ecd5d\") " pod="openshift-etcd/installer-1-master-0"
Dec 03 19:56:20.959744 master-0 kubenswrapper[9368]: I1203 19:56:20.959701 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33ba32c7-9e77-419f-b417-8f1aa28ecd5d-kube-api-access\") pod \"installer-1-master-0\" (UID: \"33ba32c7-9e77-419f-b417-8f1aa28ecd5d\") " pod="openshift-etcd/installer-1-master-0"
Dec 03 19:56:21.066063 master-0 kubenswrapper[9368]: I1203 19:56:21.065958 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0"
Dec 03 19:56:24.121976 master-0 kubenswrapper[9368]: I1203 19:56:24.121925 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_692f1783-2d80-48a7-af1b-58a1f3f99315/installer/0.log"
Dec 03 19:56:24.122473 master-0 kubenswrapper[9368]: I1203 19:56:24.122003 9368 generic.go:334] "Generic (PLEG): container finished" podID="692f1783-2d80-48a7-af1b-58a1f3f99315" containerID="a09bb661632ae0edd2e2be2bbaaf640ad99daf86961d4f77ca5c520617eeae7b" exitCode=1
Dec 03 19:56:24.122473 master-0 kubenswrapper[9368]: I1203 19:56:24.122051 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"692f1783-2d80-48a7-af1b-58a1f3f99315","Type":"ContainerDied","Data":"a09bb661632ae0edd2e2be2bbaaf640ad99daf86961d4f77ca5c520617eeae7b"}
Dec 03 19:56:24.663363 master-0 kubenswrapper[9368]: I1203 19:56:24.663302 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"]
Dec 03 19:56:24.664320 master-0 kubenswrapper[9368]: I1203 19:56:24.664162 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0"
Dec 03 19:56:24.784022 master-0 kubenswrapper[9368]: I1203 19:56:24.783709 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"]
Dec 03 19:56:24.796637 master-0 kubenswrapper[9368]: I1203 19:56:24.796506 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ce4afc7a-a338-4a2c-bada-22d4bac75d49-kube-api-access\") pod \"installer-4-master-0\" (UID: \"ce4afc7a-a338-4a2c-bada-22d4bac75d49\") " pod="openshift-kube-scheduler/installer-4-master-0"
Dec 03 19:56:24.796637 master-0 kubenswrapper[9368]: I1203 19:56:24.796607 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ce4afc7a-a338-4a2c-bada-22d4bac75d49-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"ce4afc7a-a338-4a2c-bada-22d4bac75d49\") " pod="openshift-kube-scheduler/installer-4-master-0"
Dec 03 19:56:24.806031 master-0 kubenswrapper[9368]: I1203 19:56:24.805977 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ce4afc7a-a338-4a2c-bada-22d4bac75d49-var-lock\") pod \"installer-4-master-0\" (UID: \"ce4afc7a-a338-4a2c-bada-22d4bac75d49\") " pod="openshift-kube-scheduler/installer-4-master-0"
Dec 03 19:56:24.894929 master-0 kubenswrapper[9368]: I1203 19:56:24.894861 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz"]
Dec 03 19:56:24.903806 master-0 kubenswrapper[9368]: I1203 19:56:24.895930 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" Dec 03 19:56:24.903806 master-0 kubenswrapper[9368]: I1203 19:56:24.901293 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 19:56:24.903806 master-0 kubenswrapper[9368]: I1203 19:56:24.901478 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 19:56:24.903806 master-0 kubenswrapper[9368]: I1203 19:56:24.901564 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 19:56:24.903806 master-0 kubenswrapper[9368]: I1203 19:56:24.901670 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 19:56:24.909792 master-0 kubenswrapper[9368]: I1203 19:56:24.907886 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/61ca5373-413c-4824-ba19-13b99c3081e4-machine-approver-tls\") pod \"machine-approver-5775bfbf6d-psrtz\" (UID: \"61ca5373-413c-4824-ba19-13b99c3081e4\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" Dec 03 19:56:24.909792 master-0 kubenswrapper[9368]: I1203 19:56:24.907931 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/61ca5373-413c-4824-ba19-13b99c3081e4-auth-proxy-config\") pod \"machine-approver-5775bfbf6d-psrtz\" (UID: \"61ca5373-413c-4824-ba19-13b99c3081e4\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" Dec 03 19:56:24.909792 master-0 kubenswrapper[9368]: I1203 19:56:24.907969 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lock\" (UniqueName: \"kubernetes.io/host-path/ce4afc7a-a338-4a2c-bada-22d4bac75d49-var-lock\") pod \"installer-4-master-0\" (UID: \"ce4afc7a-a338-4a2c-bada-22d4bac75d49\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 19:56:24.909792 master-0 kubenswrapper[9368]: I1203 19:56:24.908010 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ce4afc7a-a338-4a2c-bada-22d4bac75d49-kube-api-access\") pod \"installer-4-master-0\" (UID: \"ce4afc7a-a338-4a2c-bada-22d4bac75d49\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 19:56:24.909792 master-0 kubenswrapper[9368]: I1203 19:56:24.908030 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ce4afc7a-a338-4a2c-bada-22d4bac75d49-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"ce4afc7a-a338-4a2c-bada-22d4bac75d49\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 19:56:24.909792 master-0 kubenswrapper[9368]: I1203 19:56:24.908045 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61ca5373-413c-4824-ba19-13b99c3081e4-config\") pod \"machine-approver-5775bfbf6d-psrtz\" (UID: \"61ca5373-413c-4824-ba19-13b99c3081e4\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" Dec 03 19:56:24.909792 master-0 kubenswrapper[9368]: I1203 19:56:24.908080 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l2tt\" (UniqueName: \"kubernetes.io/projected/61ca5373-413c-4824-ba19-13b99c3081e4-kube-api-access-5l2tt\") pod \"machine-approver-5775bfbf6d-psrtz\" (UID: \"61ca5373-413c-4824-ba19-13b99c3081e4\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" Dec 03 19:56:24.909792 master-0 kubenswrapper[9368]: I1203 
19:56:24.908249 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ce4afc7a-a338-4a2c-bada-22d4bac75d49-var-lock\") pod \"installer-4-master-0\" (UID: \"ce4afc7a-a338-4a2c-bada-22d4bac75d49\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 19:56:24.909792 master-0 kubenswrapper[9368]: I1203 19:56:24.908409 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ce4afc7a-a338-4a2c-bada-22d4bac75d49-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"ce4afc7a-a338-4a2c-bada-22d4bac75d49\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 19:56:24.909792 master-0 kubenswrapper[9368]: I1203 19:56:24.908599 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 19:56:24.951816 master-0 kubenswrapper[9368]: I1203 19:56:24.951414 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ce4afc7a-a338-4a2c-bada-22d4bac75d49-kube-api-access\") pod \"installer-4-master-0\" (UID: \"ce4afc7a-a338-4a2c-bada-22d4bac75d49\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 19:56:25.011478 master-0 kubenswrapper[9368]: I1203 19:56:25.011421 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/61ca5373-413c-4824-ba19-13b99c3081e4-machine-approver-tls\") pod \"machine-approver-5775bfbf6d-psrtz\" (UID: \"61ca5373-413c-4824-ba19-13b99c3081e4\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" Dec 03 19:56:25.011695 master-0 kubenswrapper[9368]: I1203 19:56:25.011655 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/61ca5373-413c-4824-ba19-13b99c3081e4-auth-proxy-config\") pod \"machine-approver-5775bfbf6d-psrtz\" (UID: \"61ca5373-413c-4824-ba19-13b99c3081e4\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" Dec 03 19:56:25.011745 master-0 kubenswrapper[9368]: I1203 19:56:25.011715 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61ca5373-413c-4824-ba19-13b99c3081e4-config\") pod \"machine-approver-5775bfbf6d-psrtz\" (UID: \"61ca5373-413c-4824-ba19-13b99c3081e4\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" Dec 03 19:56:25.011796 master-0 kubenswrapper[9368]: I1203 19:56:25.011757 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l2tt\" (UniqueName: \"kubernetes.io/projected/61ca5373-413c-4824-ba19-13b99c3081e4-kube-api-access-5l2tt\") pod \"machine-approver-5775bfbf6d-psrtz\" (UID: \"61ca5373-413c-4824-ba19-13b99c3081e4\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" Dec 03 19:56:25.016811 master-0 kubenswrapper[9368]: I1203 19:56:25.012640 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/61ca5373-413c-4824-ba19-13b99c3081e4-auth-proxy-config\") pod \"machine-approver-5775bfbf6d-psrtz\" (UID: \"61ca5373-413c-4824-ba19-13b99c3081e4\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" Dec 03 19:56:25.016811 master-0 kubenswrapper[9368]: I1203 19:56:25.012742 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 19:56:25.016811 master-0 kubenswrapper[9368]: I1203 19:56:25.015353 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/61ca5373-413c-4824-ba19-13b99c3081e4-machine-approver-tls\") pod \"machine-approver-5775bfbf6d-psrtz\" (UID: \"61ca5373-413c-4824-ba19-13b99c3081e4\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" Dec 03 19:56:25.016811 master-0 kubenswrapper[9368]: I1203 19:56:25.016288 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61ca5373-413c-4824-ba19-13b99c3081e4-config\") pod \"machine-approver-5775bfbf6d-psrtz\" (UID: \"61ca5373-413c-4824-ba19-13b99c3081e4\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" Dec 03 19:56:25.069871 master-0 kubenswrapper[9368]: I1203 19:56:25.069664 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l2tt\" (UniqueName: \"kubernetes.io/projected/61ca5373-413c-4824-ba19-13b99c3081e4-kube-api-access-5l2tt\") pod \"machine-approver-5775bfbf6d-psrtz\" (UID: \"61ca5373-413c-4824-ba19-13b99c3081e4\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" Dec 03 19:56:25.132067 master-0 kubenswrapper[9368]: I1203 19:56:25.132027 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"93a5a792-4066-4863-a409-aaeb1b6ac193","Type":"ContainerStarted","Data":"d3ac5f831813ac492cbd45967896e8bddaee7f43a6c7d3b45ca773f53029fc6b"} Dec 03 19:56:25.134080 master-0 kubenswrapper[9368]: I1203 19:56:25.133735 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg" 
event={"ID":"e727b97c-263b-430a-8502-106236863710","Type":"ContainerStarted","Data":"c2527388466746c6c4766bb74fdac1bb1cb6d4de2564764a7133c319d838a12f"} Dec 03 19:56:25.245066 master-0 kubenswrapper[9368]: I1203 19:56:25.245026 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" Dec 03 19:56:25.705613 master-0 kubenswrapper[9368]: I1203 19:56:25.705504 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr"] Dec 03 19:56:25.706720 master-0 kubenswrapper[9368]: I1203 19:56:25.706675 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" Dec 03 19:56:25.714264 master-0 kubenswrapper[9368]: I1203 19:56:25.708990 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Dec 03 19:56:25.714264 master-0 kubenswrapper[9368]: I1203 19:56:25.709842 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Dec 03 19:56:25.714264 master-0 kubenswrapper[9368]: I1203 19:56:25.710247 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Dec 03 19:56:25.718088 master-0 kubenswrapper[9368]: I1203 19:56:25.718026 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Dec 03 19:56:25.742291 master-0 kubenswrapper[9368]: I1203 19:56:25.742225 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr"] Dec 03 19:56:25.826831 master-0 kubenswrapper[9368]: I1203 19:56:25.820275 9368 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6404bbc7-8ca9-4f20-8ce7-40f855555160-cco-trusted-ca\") pod \"cloud-credential-operator-7c4dc67499-lqdlr\" (UID: \"6404bbc7-8ca9-4f20-8ce7-40f855555160\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" Dec 03 19:56:25.826831 master-0 kubenswrapper[9368]: I1203 19:56:25.820330 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d468\" (UniqueName: \"kubernetes.io/projected/6404bbc7-8ca9-4f20-8ce7-40f855555160-kube-api-access-4d468\") pod \"cloud-credential-operator-7c4dc67499-lqdlr\" (UID: \"6404bbc7-8ca9-4f20-8ce7-40f855555160\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" Dec 03 19:56:25.826831 master-0 kubenswrapper[9368]: I1203 19:56:25.820351 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6404bbc7-8ca9-4f20-8ce7-40f855555160-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-7c4dc67499-lqdlr\" (UID: \"6404bbc7-8ca9-4f20-8ce7-40f855555160\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" Dec 03 19:56:25.923239 master-0 kubenswrapper[9368]: I1203 19:56:25.922387 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d468\" (UniqueName: \"kubernetes.io/projected/6404bbc7-8ca9-4f20-8ce7-40f855555160-kube-api-access-4d468\") pod \"cloud-credential-operator-7c4dc67499-lqdlr\" (UID: \"6404bbc7-8ca9-4f20-8ce7-40f855555160\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" Dec 03 19:56:25.923239 master-0 kubenswrapper[9368]: I1203 19:56:25.923226 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6404bbc7-8ca9-4f20-8ce7-40f855555160-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-7c4dc67499-lqdlr\" (UID: \"6404bbc7-8ca9-4f20-8ce7-40f855555160\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" Dec 03 19:56:25.923591 master-0 kubenswrapper[9368]: I1203 19:56:25.923404 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6404bbc7-8ca9-4f20-8ce7-40f855555160-cco-trusted-ca\") pod \"cloud-credential-operator-7c4dc67499-lqdlr\" (UID: \"6404bbc7-8ca9-4f20-8ce7-40f855555160\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" Dec 03 19:56:25.926028 master-0 kubenswrapper[9368]: I1203 19:56:25.925967 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6404bbc7-8ca9-4f20-8ce7-40f855555160-cco-trusted-ca\") pod \"cloud-credential-operator-7c4dc67499-lqdlr\" (UID: \"6404bbc7-8ca9-4f20-8ce7-40f855555160\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" Dec 03 19:56:25.930806 master-0 kubenswrapper[9368]: I1203 19:56:25.929755 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6404bbc7-8ca9-4f20-8ce7-40f855555160-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-7c4dc67499-lqdlr\" (UID: \"6404bbc7-8ca9-4f20-8ce7-40f855555160\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" Dec 03 19:56:25.944582 master-0 kubenswrapper[9368]: I1203 19:56:25.944529 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d468\" (UniqueName: 
\"kubernetes.io/projected/6404bbc7-8ca9-4f20-8ce7-40f855555160-kube-api-access-4d468\") pod \"cloud-credential-operator-7c4dc67499-lqdlr\" (UID: \"6404bbc7-8ca9-4f20-8ce7-40f855555160\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" Dec 03 19:56:26.030295 master-0 kubenswrapper[9368]: I1203 19:56:26.028099 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" Dec 03 19:56:26.044230 master-0 kubenswrapper[9368]: I1203 19:56:26.044157 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv"] Dec 03 19:56:26.045146 master-0 kubenswrapper[9368]: I1203 19:56:26.045111 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv" Dec 03 19:56:26.046748 master-0 kubenswrapper[9368]: I1203 19:56:26.046699 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 19:56:26.049186 master-0 kubenswrapper[9368]: I1203 19:56:26.049145 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 19:56:26.049744 master-0 kubenswrapper[9368]: I1203 19:56:26.049704 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 19:56:26.054657 master-0 kubenswrapper[9368]: I1203 19:56:26.054601 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv"] Dec 03 19:56:26.125859 master-0 kubenswrapper[9368]: I1203 19:56:26.125218 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl5h7\" (UniqueName: 
\"kubernetes.io/projected/6a82ff78-4383-4ca8-8a72-98c2ee50ffe2-kube-api-access-dl5h7\") pod \"cluster-samples-operator-6d64b47964-h9nkv\" (UID: \"6a82ff78-4383-4ca8-8a72-98c2ee50ffe2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv" Dec 03 19:56:26.125859 master-0 kubenswrapper[9368]: I1203 19:56:26.125298 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a82ff78-4383-4ca8-8a72-98c2ee50ffe2-samples-operator-tls\") pod \"cluster-samples-operator-6d64b47964-h9nkv\" (UID: \"6a82ff78-4383-4ca8-8a72-98c2ee50ffe2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv" Dec 03 19:56:26.226868 master-0 kubenswrapper[9368]: I1203 19:56:26.226381 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl5h7\" (UniqueName: \"kubernetes.io/projected/6a82ff78-4383-4ca8-8a72-98c2ee50ffe2-kube-api-access-dl5h7\") pod \"cluster-samples-operator-6d64b47964-h9nkv\" (UID: \"6a82ff78-4383-4ca8-8a72-98c2ee50ffe2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv" Dec 03 19:56:26.227654 master-0 kubenswrapper[9368]: I1203 19:56:26.226895 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a82ff78-4383-4ca8-8a72-98c2ee50ffe2-samples-operator-tls\") pod \"cluster-samples-operator-6d64b47964-h9nkv\" (UID: \"6a82ff78-4383-4ca8-8a72-98c2ee50ffe2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv" Dec 03 19:56:26.243633 master-0 kubenswrapper[9368]: I1203 19:56:26.243591 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a82ff78-4383-4ca8-8a72-98c2ee50ffe2-samples-operator-tls\") pod \"cluster-samples-operator-6d64b47964-h9nkv\" 
(UID: \"6a82ff78-4383-4ca8-8a72-98c2ee50ffe2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv" Dec 03 19:56:26.252075 master-0 kubenswrapper[9368]: I1203 19:56:26.252036 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl5h7\" (UniqueName: \"kubernetes.io/projected/6a82ff78-4383-4ca8-8a72-98c2ee50ffe2-kube-api-access-dl5h7\") pod \"cluster-samples-operator-6d64b47964-h9nkv\" (UID: \"6a82ff78-4383-4ca8-8a72-98c2ee50ffe2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv" Dec 03 19:56:26.275643 master-0 kubenswrapper[9368]: W1203 19:56:26.275568 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod186cc14f_5f58_43ca_8ffa_db07606ff0f7.slice/crio-b9dc51ef0fc54ef8c73610d3365ca5738cac69e45dfc432bfd97fab8a56b1782 WatchSource:0}: Error finding container b9dc51ef0fc54ef8c73610d3365ca5738cac69e45dfc432bfd97fab8a56b1782: Status 404 returned error can't find the container with id b9dc51ef0fc54ef8c73610d3365ca5738cac69e45dfc432bfd97fab8a56b1782 Dec 03 19:56:26.342979 master-0 kubenswrapper[9368]: I1203 19:56:26.342942 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb"] Dec 03 19:56:26.343666 master-0 kubenswrapper[9368]: I1203 19:56:26.343648 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" Dec 03 19:56:26.346558 master-0 kubenswrapper[9368]: I1203 19:56:26.346524 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Dec 03 19:56:26.346654 master-0 kubenswrapper[9368]: I1203 19:56:26.346616 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Dec 03 19:56:26.346967 master-0 kubenswrapper[9368]: I1203 19:56:26.346942 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Dec 03 19:56:26.347625 master-0 kubenswrapper[9368]: I1203 19:56:26.347171 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Dec 03 19:56:26.378825 master-0 kubenswrapper[9368]: I1203 19:56:26.356908 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb"] Dec 03 19:56:26.378825 master-0 kubenswrapper[9368]: I1203 19:56:26.364502 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv" Dec 03 19:56:26.428957 master-0 kubenswrapper[9368]: I1203 19:56:26.428925 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/433c3273-c99e-4d68-befc-06f92d2fc8d5-cert\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" Dec 03 19:56:26.428957 master-0 kubenswrapper[9368]: I1203 19:56:26.428964 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/433c3273-c99e-4d68-befc-06f92d2fc8d5-config\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" Dec 03 19:56:26.429114 master-0 kubenswrapper[9368]: I1203 19:56:26.428989 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/433c3273-c99e-4d68-befc-06f92d2fc8d5-images\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" Dec 03 19:56:26.429114 master-0 kubenswrapper[9368]: I1203 19:56:26.429046 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwcj7\" (UniqueName: \"kubernetes.io/projected/433c3273-c99e-4d68-befc-06f92d2fc8d5-kube-api-access-xwcj7\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" Dec 03 19:56:26.429114 master-0 kubenswrapper[9368]: I1203 19:56:26.429071 9368 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/433c3273-c99e-4d68-befc-06f92d2fc8d5-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" Dec 03 19:56:26.540866 master-0 kubenswrapper[9368]: I1203 19:56:26.530046 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwcj7\" (UniqueName: \"kubernetes.io/projected/433c3273-c99e-4d68-befc-06f92d2fc8d5-kube-api-access-xwcj7\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" Dec 03 19:56:26.540866 master-0 kubenswrapper[9368]: I1203 19:56:26.530094 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/433c3273-c99e-4d68-befc-06f92d2fc8d5-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" Dec 03 19:56:26.540866 master-0 kubenswrapper[9368]: I1203 19:56:26.530120 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/433c3273-c99e-4d68-befc-06f92d2fc8d5-cert\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" Dec 03 19:56:26.540866 master-0 kubenswrapper[9368]: I1203 19:56:26.530143 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/433c3273-c99e-4d68-befc-06f92d2fc8d5-config\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb"
Dec 03 19:56:26.540866 master-0 kubenswrapper[9368]: I1203 19:56:26.530160 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/433c3273-c99e-4d68-befc-06f92d2fc8d5-images\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb"
Dec 03 19:56:26.540866 master-0 kubenswrapper[9368]: I1203 19:56:26.531331 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/433c3273-c99e-4d68-befc-06f92d2fc8d5-config\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb"
Dec 03 19:56:26.540866 master-0 kubenswrapper[9368]: I1203 19:56:26.532032 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/433c3273-c99e-4d68-befc-06f92d2fc8d5-images\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb"
Dec 03 19:56:26.540866 master-0 kubenswrapper[9368]: I1203 19:56:26.534477 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/433c3273-c99e-4d68-befc-06f92d2fc8d5-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb"
Dec 03 19:56:26.540866 master-0 kubenswrapper[9368]: I1203 19:56:26.535058 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/433c3273-c99e-4d68-befc-06f92d2fc8d5-cert\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb"
Dec 03 19:56:26.551026 master-0 kubenswrapper[9368]: I1203 19:56:26.550977 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwcj7\" (UniqueName: \"kubernetes.io/projected/433c3273-c99e-4d68-befc-06f92d2fc8d5-kube-api-access-xwcj7\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb"
Dec 03 19:56:26.729836 master-0 kubenswrapper[9368]: I1203 19:56:26.725423 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb"
Dec 03 19:56:27.145639 master-0 kubenswrapper[9368]: I1203 19:56:27.145502 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"186cc14f-5f58-43ca-8ffa-db07606ff0f7","Type":"ContainerStarted","Data":"b9dc51ef0fc54ef8c73610d3365ca5738cac69e45dfc432bfd97fab8a56b1782"}
Dec 03 19:56:27.901317 master-0 kubenswrapper[9368]: I1203 19:56:27.901252 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_692f1783-2d80-48a7-af1b-58a1f3f99315/installer/0.log"
Dec 03 19:56:27.902042 master-0 kubenswrapper[9368]: I1203 19:56:27.901369 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0"
Dec 03 19:56:27.951076 master-0 kubenswrapper[9368]: I1203 19:56:27.950965 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/692f1783-2d80-48a7-af1b-58a1f3f99315-var-lock\") pod \"692f1783-2d80-48a7-af1b-58a1f3f99315\" (UID: \"692f1783-2d80-48a7-af1b-58a1f3f99315\") "
Dec 03 19:56:27.951314 master-0 kubenswrapper[9368]: I1203 19:56:27.951151 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/692f1783-2d80-48a7-af1b-58a1f3f99315-kubelet-dir\") pod \"692f1783-2d80-48a7-af1b-58a1f3f99315\" (UID: \"692f1783-2d80-48a7-af1b-58a1f3f99315\") "
Dec 03 19:56:27.951871 master-0 kubenswrapper[9368]: I1203 19:56:27.951595 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/692f1783-2d80-48a7-af1b-58a1f3f99315-kube-api-access\") pod \"692f1783-2d80-48a7-af1b-58a1f3f99315\" (UID: \"692f1783-2d80-48a7-af1b-58a1f3f99315\") "
Dec 03 19:56:27.954085 master-0 kubenswrapper[9368]: I1203 19:56:27.953146 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/692f1783-2d80-48a7-af1b-58a1f3f99315-var-lock" (OuterVolumeSpecName: "var-lock") pod "692f1783-2d80-48a7-af1b-58a1f3f99315" (UID: "692f1783-2d80-48a7-af1b-58a1f3f99315"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:56:27.954085 master-0 kubenswrapper[9368]: I1203 19:56:27.953224 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/692f1783-2d80-48a7-af1b-58a1f3f99315-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "692f1783-2d80-48a7-af1b-58a1f3f99315" (UID: "692f1783-2d80-48a7-af1b-58a1f3f99315"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:56:27.955682 master-0 kubenswrapper[9368]: I1203 19:56:27.955624 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/692f1783-2d80-48a7-af1b-58a1f3f99315-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "692f1783-2d80-48a7-af1b-58a1f3f99315" (UID: "692f1783-2d80-48a7-af1b-58a1f3f99315"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:56:28.020678 master-0 kubenswrapper[9368]: I1203 19:56:28.020160 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"]
Dec 03 19:56:28.020678 master-0 kubenswrapper[9368]: E1203 19:56:28.020492 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="692f1783-2d80-48a7-af1b-58a1f3f99315" containerName="installer"
Dec 03 19:56:28.020678 master-0 kubenswrapper[9368]: I1203 19:56:28.020511 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="692f1783-2d80-48a7-af1b-58a1f3f99315" containerName="installer"
Dec 03 19:56:28.020678 master-0 kubenswrapper[9368]: I1203 19:56:28.020685 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="692f1783-2d80-48a7-af1b-58a1f3f99315" containerName="installer"
Dec 03 19:56:28.025893 master-0 kubenswrapper[9368]: I1203 19:56:28.022490 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 19:56:28.025893 master-0 kubenswrapper[9368]: I1203 19:56:28.023987 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Dec 03 19:56:28.026201 master-0 kubenswrapper[9368]: I1203 19:56:28.026169 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 03 19:56:28.026242 master-0 kubenswrapper[9368]: I1203 19:56:28.026201 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Dec 03 19:56:28.026371 master-0 kubenswrapper[9368]: I1203 19:56:28.026328 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 03 19:56:28.026541 master-0 kubenswrapper[9368]: I1203 19:56:28.026461 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Dec 03 19:56:28.034813 master-0 kubenswrapper[9368]: I1203 19:56:28.034726 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"]
Dec 03 19:56:28.053730 master-0 kubenswrapper[9368]: I1203 19:56:28.053672 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-proxy-tls\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 19:56:28.053811 master-0 kubenswrapper[9368]: I1203 19:56:28.053791 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzd2g\" (UniqueName: \"kubernetes.io/projected/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-kube-api-access-qzd2g\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 19:56:28.053863 master-0 kubenswrapper[9368]: I1203 19:56:28.053833 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-images\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 19:56:28.053951 master-0 kubenswrapper[9368]: I1203 19:56:28.053924 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-auth-proxy-config\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 19:56:28.054063 master-0 kubenswrapper[9368]: I1203 19:56:28.054013 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/692f1783-2d80-48a7-af1b-58a1f3f99315-kube-api-access\") on node \"master-0\" DevicePath \"\""
Dec 03 19:56:28.054102 master-0 kubenswrapper[9368]: I1203 19:56:28.054061 9368 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/692f1783-2d80-48a7-af1b-58a1f3f99315-var-lock\") on node \"master-0\" DevicePath \"\""
Dec 03 19:56:28.054102 master-0 kubenswrapper[9368]: I1203 19:56:28.054072 9368 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/692f1783-2d80-48a7-af1b-58a1f3f99315-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Dec 03 19:56:28.080511 master-0 kubenswrapper[9368]: I1203 19:56:28.080318 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4"]
Dec 03 19:56:28.081264 master-0 kubenswrapper[9368]: I1203 19:56:28.081232 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4"
Dec 03 19:56:28.084099 master-0 kubenswrapper[9368]: I1203 19:56:28.084038 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Dec 03 19:56:28.084512 master-0 kubenswrapper[9368]: I1203 19:56:28.084484 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Dec 03 19:56:28.096984 master-0 kubenswrapper[9368]: I1203 19:56:28.096943 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4"]
Dec 03 19:56:28.136042 master-0 kubenswrapper[9368]: I1203 19:56:28.135977 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/installer-1-master-0"]
Dec 03 19:56:28.155609 master-0 kubenswrapper[9368]: I1203 19:56:28.155541 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-auth-proxy-config\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 19:56:28.155858 master-0 kubenswrapper[9368]: I1203 19:56:28.155620 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-proxy-tls\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 19:56:28.155858 master-0 kubenswrapper[9368]: I1203 19:56:28.155676 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzd2g\" (UniqueName: \"kubernetes.io/projected/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-kube-api-access-qzd2g\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 19:56:28.155858 master-0 kubenswrapper[9368]: I1203 19:56:28.155699 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-images\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 19:56:28.157113 master-0 kubenswrapper[9368]: I1203 19:56:28.157083 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-images\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 19:56:28.157229 master-0 kubenswrapper[9368]: E1203 19:56:28.157202 9368 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: configmap "kube-rbac-proxy" not found
Dec 03 19:56:28.157282 master-0 kubenswrapper[9368]: E1203 19:56:28.157253 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-auth-proxy-config podName:8dbbb6f8-711c-49a0-bc36-fa5d50124bd8 nodeName:}" failed. No retries permitted until 2025-12-03 19:56:28.657238442 +0000 UTC m=+54.318488353 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-auth-proxy-config") pod "machine-config-operator-664c9d94c9-lt6dx" (UID: "8dbbb6f8-711c-49a0-bc36-fa5d50124bd8") : configmap "kube-rbac-proxy" not found
Dec 03 19:56:28.165752 master-0 kubenswrapper[9368]: I1203 19:56:28.163908 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-proxy-tls\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 19:56:28.171916 master-0 kubenswrapper[9368]: I1203 19:56:28.171883 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_692f1783-2d80-48a7-af1b-58a1f3f99315/installer/0.log"
Dec 03 19:56:28.172045 master-0 kubenswrapper[9368]: I1203 19:56:28.171954 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"692f1783-2d80-48a7-af1b-58a1f3f99315","Type":"ContainerDied","Data":"f126dafdd5aa72693eb55c4dcedd323ed5a556c45b737d2405e2a32737d0b414"}
Dec 03 19:56:28.172045 master-0 kubenswrapper[9368]: I1203 19:56:28.172014 9368 scope.go:117] "RemoveContainer" containerID="a09bb661632ae0edd2e2be2bbaaf640ad99daf86961d4f77ca5c520617eeae7b"
Dec 03 19:56:28.172218 master-0 kubenswrapper[9368]: I1203 19:56:28.172176 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0"
Dec 03 19:56:28.176878 master-0 kubenswrapper[9368]: I1203 19:56:28.176030 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p"]
Dec 03 19:56:28.176878 master-0 kubenswrapper[9368]: I1203 19:56:28.176644 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p"
Dec 03 19:56:28.178286 master-0 kubenswrapper[9368]: I1203 19:56:28.178245 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzd2g\" (UniqueName: \"kubernetes.io/projected/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-kube-api-access-qzd2g\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 19:56:28.178542 master-0 kubenswrapper[9368]: I1203 19:56:28.178514 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Dec 03 19:56:28.191621 master-0 kubenswrapper[9368]: I1203 19:56:28.191573 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p"]
Dec 03 19:56:28.234737 master-0 kubenswrapper[9368]: I1203 19:56:28.234665 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"]
Dec 03 19:56:28.238150 master-0 kubenswrapper[9368]: I1203 19:56:28.238103 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"]
Dec 03 19:56:28.257388 master-0 kubenswrapper[9368]: I1203 19:56:28.257329 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-cert\") pod \"cluster-autoscaler-operator-7f88444875-kqfs4\" (UID: \"b2021db5-b27a-4e06-beec-d9ba82aa1ffc\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4"
Dec 03 19:56:28.257388 master-0 kubenswrapper[9368]: I1203 19:56:28.257384 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-auth-proxy-config\") pod \"cluster-autoscaler-operator-7f88444875-kqfs4\" (UID: \"b2021db5-b27a-4e06-beec-d9ba82aa1ffc\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4"
Dec 03 19:56:28.257603 master-0 kubenswrapper[9368]: I1203 19:56:28.257458 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6skg\" (UniqueName: \"kubernetes.io/projected/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-kube-api-access-j6skg\") pod \"cluster-autoscaler-operator-7f88444875-kqfs4\" (UID: \"b2021db5-b27a-4e06-beec-d9ba82aa1ffc\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4"
Dec 03 19:56:28.307574 master-0 kubenswrapper[9368]: I1203 19:56:28.307432 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-59d99f9b7b-h64kt"]
Dec 03 19:56:28.308319 master-0 kubenswrapper[9368]: I1203 19:56:28.308294 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 19:56:28.310786 master-0 kubenswrapper[9368]: I1203 19:56:28.310734 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Dec 03 19:56:28.310954 master-0 kubenswrapper[9368]: I1203 19:56:28.310931 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt"
Dec 03 19:56:28.311092 master-0 kubenswrapper[9368]: I1203 19:56:28.311072 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Dec 03 19:56:28.319681 master-0 kubenswrapper[9368]: I1203 19:56:28.319631 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-59d99f9b7b-h64kt"]
Dec 03 19:56:28.321821 master-0 kubenswrapper[9368]: I1203 19:56:28.321752 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Dec 03 19:56:28.324145 master-0 kubenswrapper[9368]: I1203 19:56:28.324112 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Dec 03 19:56:28.359208 master-0 kubenswrapper[9368]: I1203 19:56:28.359103 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-cert\") pod \"cluster-autoscaler-operator-7f88444875-kqfs4\" (UID: \"b2021db5-b27a-4e06-beec-d9ba82aa1ffc\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4"
Dec 03 19:56:28.359208 master-0 kubenswrapper[9368]: I1203 19:56:28.359165 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-auth-proxy-config\") pod \"cluster-autoscaler-operator-7f88444875-kqfs4\" (UID: \"b2021db5-b27a-4e06-beec-d9ba82aa1ffc\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4"
Dec 03 19:56:28.359463 master-0 kubenswrapper[9368]: I1203 19:56:28.359231 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6skg\" (UniqueName: \"kubernetes.io/projected/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-kube-api-access-j6skg\") pod \"cluster-autoscaler-operator-7f88444875-kqfs4\" (UID: \"b2021db5-b27a-4e06-beec-d9ba82aa1ffc\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4"
Dec 03 19:56:28.359463 master-0 kubenswrapper[9368]: I1203 19:56:28.359290 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvklf\" (UniqueName: \"kubernetes.io/projected/f749c7f2-1fd7-4078-a92d-0ae5523998ac-kube-api-access-lvklf\") pod \"cluster-storage-operator-f84784664-wnl8p\" (UID: \"f749c7f2-1fd7-4078-a92d-0ae5523998ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p"
Dec 03 19:56:28.359463 master-0 kubenswrapper[9368]: I1203 19:56:28.359317 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f749c7f2-1fd7-4078-a92d-0ae5523998ac-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-f84784664-wnl8p\" (UID: \"f749c7f2-1fd7-4078-a92d-0ae5523998ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p"
Dec 03 19:56:28.360360 master-0 kubenswrapper[9368]: I1203 19:56:28.360331 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-auth-proxy-config\") pod \"cluster-autoscaler-operator-7f88444875-kqfs4\" (UID: \"b2021db5-b27a-4e06-beec-d9ba82aa1ffc\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4"
Dec 03 19:56:28.364709 master-0 kubenswrapper[9368]: I1203 19:56:28.364664 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-cert\") pod \"cluster-autoscaler-operator-7f88444875-kqfs4\" (UID: \"b2021db5-b27a-4e06-beec-d9ba82aa1ffc\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4"
Dec 03 19:56:28.375274 master-0 kubenswrapper[9368]: I1203 19:56:28.375217 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6skg\" (UniqueName: \"kubernetes.io/projected/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-kube-api-access-j6skg\") pod \"cluster-autoscaler-operator-7f88444875-kqfs4\" (UID: \"b2021db5-b27a-4e06-beec-d9ba82aa1ffc\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4"
Dec 03 19:56:28.414175 master-0 kubenswrapper[9368]: I1203 19:56:28.414120 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4"
Dec 03 19:56:28.460470 master-0 kubenswrapper[9368]: I1203 19:56:28.460414 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/af2023e1-9c7a-40af-a6bf-fba31c3565b1-snapshots\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 19:56:28.460470 master-0 kubenswrapper[9368]: I1203 19:56:28.460462 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af2023e1-9c7a-40af-a6bf-fba31c3565b1-serving-cert\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 19:56:28.461298 master-0 kubenswrapper[9368]: I1203 19:56:28.461270 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdd6z\" (UniqueName: \"kubernetes.io/projected/af2023e1-9c7a-40af-a6bf-fba31c3565b1-kube-api-access-hdd6z\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 19:56:28.461424 master-0 kubenswrapper[9368]: I1203 19:56:28.461405 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvklf\" (UniqueName: \"kubernetes.io/projected/f749c7f2-1fd7-4078-a92d-0ae5523998ac-kube-api-access-lvklf\") pod \"cluster-storage-operator-f84784664-wnl8p\" (UID: \"f749c7f2-1fd7-4078-a92d-0ae5523998ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p"
Dec 03 19:56:28.461456 master-0 kubenswrapper[9368]: I1203 19:56:28.461434 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af2023e1-9c7a-40af-a6bf-fba31c3565b1-service-ca-bundle\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 19:56:28.461488 master-0 kubenswrapper[9368]: I1203 19:56:28.461460 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f749c7f2-1fd7-4078-a92d-0ae5523998ac-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-f84784664-wnl8p\" (UID: \"f749c7f2-1fd7-4078-a92d-0ae5523998ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p"
Dec 03 19:56:28.461515 master-0 kubenswrapper[9368]: I1203 19:56:28.461487 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af2023e1-9c7a-40af-a6bf-fba31c3565b1-trusted-ca-bundle\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 19:56:28.465738 master-0 kubenswrapper[9368]: I1203 19:56:28.465694 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f749c7f2-1fd7-4078-a92d-0ae5523998ac-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-f84784664-wnl8p\" (UID: \"f749c7f2-1fd7-4078-a92d-0ae5523998ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p"
Dec 03 19:56:28.476590 master-0 kubenswrapper[9368]: I1203 19:56:28.476538 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvklf\" (UniqueName: \"kubernetes.io/projected/f749c7f2-1fd7-4078-a92d-0ae5523998ac-kube-api-access-lvklf\") pod \"cluster-storage-operator-f84784664-wnl8p\" (UID: \"f749c7f2-1fd7-4078-a92d-0ae5523998ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p"
Dec 03 19:56:28.523721 master-0 kubenswrapper[9368]: I1203 19:56:28.523664 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p"
Dec 03 19:56:28.562453 master-0 kubenswrapper[9368]: I1203 19:56:28.562381 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/af2023e1-9c7a-40af-a6bf-fba31c3565b1-snapshots\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 19:56:28.562651 master-0 kubenswrapper[9368]: I1203 19:56:28.562463 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af2023e1-9c7a-40af-a6bf-fba31c3565b1-serving-cert\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 19:56:28.562651 master-0 kubenswrapper[9368]: I1203 19:56:28.562546 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdd6z\" (UniqueName: \"kubernetes.io/projected/af2023e1-9c7a-40af-a6bf-fba31c3565b1-kube-api-access-hdd6z\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 19:56:28.562651 master-0 kubenswrapper[9368]: I1203 19:56:28.562604 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af2023e1-9c7a-40af-a6bf-fba31c3565b1-service-ca-bundle\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 19:56:28.562793 master-0 kubenswrapper[9368]: I1203 19:56:28.562666 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af2023e1-9c7a-40af-a6bf-fba31c3565b1-trusted-ca-bundle\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 19:56:28.563344 master-0 kubenswrapper[9368]: I1203 19:56:28.563300 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/af2023e1-9c7a-40af-a6bf-fba31c3565b1-snapshots\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 19:56:28.563708 master-0 kubenswrapper[9368]: I1203 19:56:28.563677 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af2023e1-9c7a-40af-a6bf-fba31c3565b1-service-ca-bundle\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 19:56:28.564862 master-0 kubenswrapper[9368]: I1203 19:56:28.564814 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af2023e1-9c7a-40af-a6bf-fba31c3565b1-trusted-ca-bundle\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 19:56:28.567322 master-0 kubenswrapper[9368]: I1203 19:56:28.567282 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af2023e1-9c7a-40af-a6bf-fba31c3565b1-serving-cert\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 19:56:28.567717 master-0 kubenswrapper[9368]: I1203 19:56:28.567677 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="692f1783-2d80-48a7-af1b-58a1f3f99315" path="/var/lib/kubelet/pods/692f1783-2d80-48a7-af1b-58a1f3f99315/volumes"
Dec 03 19:56:28.582810 master-0 kubenswrapper[9368]: I1203 19:56:28.582736 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdd6z\" (UniqueName: \"kubernetes.io/projected/af2023e1-9c7a-40af-a6bf-fba31c3565b1-kube-api-access-hdd6z\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 19:56:28.623183 master-0 kubenswrapper[9368]: I1203 19:56:28.623067 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 19:56:28.625324 master-0 kubenswrapper[9368]: I1203 19:56:28.624392 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Dec 03 19:56:28.663482 master-0 kubenswrapper[9368]: I1203 19:56:28.663429 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-auth-proxy-config\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 19:56:28.664108 master-0 kubenswrapper[9368]: I1203 19:56:28.664068 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-auth-proxy-config\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 19:56:28.675716 master-0 kubenswrapper[9368]: I1203 19:56:28.675320 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 19:56:36.234701 master-0 kubenswrapper[9368]: I1203 19:56:36.234518 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Dec 03 19:56:36.235261 master-0 kubenswrapper[9368]: I1203 19:56:36.235236 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Dec 03 19:56:36.236593 master-0 kubenswrapper[9368]: I1203 19:56:36.236144 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-2-master-0"]
Dec 03 19:56:36.236718 master-0 kubenswrapper[9368]: I1203 19:56:36.236702 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0"
Dec 03 19:56:36.339446 master-0 kubenswrapper[9368]: I1203 19:56:36.337166 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Dec 03 19:56:36.339446 master-0 kubenswrapper[9368]: I1203 19:56:36.339190 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"]
Dec 03 19:56:36.340981 master-0 kubenswrapper[9368]: I1203 19:56:36.340951 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg"]
Dec 03 19:56:36.374897 master-0 kubenswrapper[9368]: I1203 19:56:36.373279 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9afe01c7-825c-43d1-8425-0317cdde11d6-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"9afe01c7-825c-43d1-8425-0317cdde11d6\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Dec 03 19:56:36.374897 master-0 kubenswrapper[9368]: I1203 19:56:36.373388 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9afe01c7-825c-43d1-8425-0317cdde11d6-var-lock\") pod \"installer-2-master-0\" (UID: \"9afe01c7-825c-43d1-8425-0317cdde11d6\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Dec 03 19:56:36.374897 master-0 kubenswrapper[9368]: I1203 19:56:36.373416 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bacd155a-fee3-4e5e-89a2-ab86f401d2ff-kube-api-access\") pod \"installer-2-master-0\" (UID: \"bacd155a-fee3-4e5e-89a2-ab86f401d2ff\") " pod="openshift-etcd/installer-2-master-0"
Dec 03 19:56:36.374897 master-0 kubenswrapper[9368]: I1203 19:56:36.373457 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bacd155a-fee3-4e5e-89a2-ab86f401d2ff-var-lock\") pod \"installer-2-master-0\" (UID: \"bacd155a-fee3-4e5e-89a2-ab86f401d2ff\") " pod="openshift-etcd/installer-2-master-0"
Dec 03 19:56:36.374897 master-0 kubenswrapper[9368]: I1203 19:56:36.373487 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9afe01c7-825c-43d1-8425-0317cdde11d6-kube-api-access\") pod \"installer-2-master-0\" (UID: \"9afe01c7-825c-43d1-8425-0317cdde11d6\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Dec 03 19:56:36.374897 master-0 kubenswrapper[9368]: I1203 19:56:36.373528 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bacd155a-fee3-4e5e-89a2-ab86f401d2ff-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"bacd155a-fee3-4e5e-89a2-ab86f401d2ff\") " pod="openshift-etcd/installer-2-master-0"
Dec 03 19:56:36.475582 master-0 kubenswrapper[9368]: I1203 19:56:36.475189 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bacd155a-fee3-4e5e-89a2-ab86f401d2ff-kube-api-access\") pod \"installer-2-master-0\" (UID: \"bacd155a-fee3-4e5e-89a2-ab86f401d2ff\") " pod="openshift-etcd/installer-2-master-0"
Dec 03 19:56:36.475582 master-0 kubenswrapper[9368]: I1203
19:56:36.475251 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bacd155a-fee3-4e5e-89a2-ab86f401d2ff-var-lock\") pod \"installer-2-master-0\" (UID: \"bacd155a-fee3-4e5e-89a2-ab86f401d2ff\") " pod="openshift-etcd/installer-2-master-0" Dec 03 19:56:36.475582 master-0 kubenswrapper[9368]: I1203 19:56:36.475281 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9afe01c7-825c-43d1-8425-0317cdde11d6-kube-api-access\") pod \"installer-2-master-0\" (UID: \"9afe01c7-825c-43d1-8425-0317cdde11d6\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 19:56:36.475582 master-0 kubenswrapper[9368]: I1203 19:56:36.475301 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bacd155a-fee3-4e5e-89a2-ab86f401d2ff-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"bacd155a-fee3-4e5e-89a2-ab86f401d2ff\") " pod="openshift-etcd/installer-2-master-0" Dec 03 19:56:36.475582 master-0 kubenswrapper[9368]: I1203 19:56:36.475319 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9afe01c7-825c-43d1-8425-0317cdde11d6-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"9afe01c7-825c-43d1-8425-0317cdde11d6\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 19:56:36.475582 master-0 kubenswrapper[9368]: I1203 19:56:36.475365 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9afe01c7-825c-43d1-8425-0317cdde11d6-var-lock\") pod \"installer-2-master-0\" (UID: \"9afe01c7-825c-43d1-8425-0317cdde11d6\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 19:56:36.475582 master-0 kubenswrapper[9368]: I1203 19:56:36.475438 9368 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9afe01c7-825c-43d1-8425-0317cdde11d6-var-lock\") pod \"installer-2-master-0\" (UID: \"9afe01c7-825c-43d1-8425-0317cdde11d6\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 19:56:36.475977 master-0 kubenswrapper[9368]: I1203 19:56:36.475716 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bacd155a-fee3-4e5e-89a2-ab86f401d2ff-var-lock\") pod \"installer-2-master-0\" (UID: \"bacd155a-fee3-4e5e-89a2-ab86f401d2ff\") " pod="openshift-etcd/installer-2-master-0" Dec 03 19:56:36.475977 master-0 kubenswrapper[9368]: I1203 19:56:36.475865 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bacd155a-fee3-4e5e-89a2-ab86f401d2ff-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"bacd155a-fee3-4e5e-89a2-ab86f401d2ff\") " pod="openshift-etcd/installer-2-master-0" Dec 03 19:56:36.475977 master-0 kubenswrapper[9368]: I1203 19:56:36.475891 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9afe01c7-825c-43d1-8425-0317cdde11d6-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"9afe01c7-825c-43d1-8425-0317cdde11d6\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 19:56:36.501417 master-0 kubenswrapper[9368]: I1203 19:56:36.501322 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9afe01c7-825c-43d1-8425-0317cdde11d6-kube-api-access\") pod \"installer-2-master-0\" (UID: \"9afe01c7-825c-43d1-8425-0317cdde11d6\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 19:56:36.520831 master-0 kubenswrapper[9368]: I1203 19:56:36.518606 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bacd155a-fee3-4e5e-89a2-ab86f401d2ff-kube-api-access\") pod \"installer-2-master-0\" (UID: \"bacd155a-fee3-4e5e-89a2-ab86f401d2ff\") " pod="openshift-etcd/installer-2-master-0" Dec 03 19:56:36.538955 master-0 kubenswrapper[9368]: I1203 19:56:36.537715 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99"] Dec 03 19:56:36.538955 master-0 kubenswrapper[9368]: I1203 19:56:36.538637 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:56:36.542435 master-0 kubenswrapper[9368]: I1203 19:56:36.542399 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Dec 03 19:56:36.549479 master-0 kubenswrapper[9368]: I1203 19:56:36.549430 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 19:56:36.549479 master-0 kubenswrapper[9368]: I1203 19:56:36.549457 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Dec 03 19:56:36.549653 master-0 kubenswrapper[9368]: I1203 19:56:36.549473 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Dec 03 19:56:36.549653 master-0 kubenswrapper[9368]: I1203 19:56:36.549624 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Dec 03 19:56:36.560141 master-0 kubenswrapper[9368]: I1203 19:56:36.560100 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 19:56:36.592929 master-0 kubenswrapper[9368]: I1203 19:56:36.589419 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-68f766fc9-lwgcg"] Dec 03 19:56:36.592929 master-0 kubenswrapper[9368]: I1203 19:56:36.589765 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Dec 03 19:56:36.603875 master-0 kubenswrapper[9368]: I1203 19:56:36.603833 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"] Dec 03 19:56:36.678798 master-0 kubenswrapper[9368]: I1203 19:56:36.677807 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-npd99\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:56:36.678798 master-0 kubenswrapper[9368]: I1203 19:56:36.677847 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgjkp\" (UniqueName: \"kubernetes.io/projected/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-kube-api-access-rgjkp\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-npd99\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:56:36.678798 master-0 kubenswrapper[9368]: I1203 19:56:36.677894 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-images\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-npd99\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:56:36.678798 master-0 kubenswrapper[9368]: I1203 19:56:36.677968 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-npd99\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:56:36.678798 master-0 kubenswrapper[9368]: I1203 19:56:36.677987 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-npd99\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:56:36.789857 master-0 kubenswrapper[9368]: I1203 19:56:36.779527 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-npd99\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:56:36.789857 master-0 kubenswrapper[9368]: I1203 19:56:36.779578 9368 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rgjkp\" (UniqueName: \"kubernetes.io/projected/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-kube-api-access-rgjkp\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-npd99\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:56:36.789857 master-0 kubenswrapper[9368]: I1203 19:56:36.779623 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-images\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-npd99\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:56:36.789857 master-0 kubenswrapper[9368]: I1203 19:56:36.779668 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-npd99\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:56:36.789857 master-0 kubenswrapper[9368]: I1203 19:56:36.779684 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-npd99\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:56:36.789857 master-0 kubenswrapper[9368]: I1203 19:56:36.781241 9368 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-npd99\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:56:36.789857 master-0 kubenswrapper[9368]: I1203 19:56:36.781366 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-npd99\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:56:36.789857 master-0 kubenswrapper[9368]: I1203 19:56:36.781997 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-images\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-npd99\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:56:36.789857 master-0 kubenswrapper[9368]: I1203 19:56:36.783520 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-npd99\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:56:36.795927 master-0 kubenswrapper[9368]: I1203 19:56:36.792253 9368 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-api/machine-api-operator-7486ff55f-9p9rq"] Dec 03 19:56:36.795927 master-0 kubenswrapper[9368]: I1203 19:56:36.793048 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" Dec 03 19:56:36.795927 master-0 kubenswrapper[9368]: I1203 19:56:36.794797 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 19:56:36.795927 master-0 kubenswrapper[9368]: I1203 19:56:36.795119 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 19:56:36.795927 master-0 kubenswrapper[9368]: I1203 19:56:36.795300 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 19:56:36.810142 master-0 kubenswrapper[9368]: I1203 19:56:36.810082 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-7486ff55f-9p9rq"] Dec 03 19:56:36.822564 master-0 kubenswrapper[9368]: I1203 19:56:36.812368 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgjkp\" (UniqueName: \"kubernetes.io/projected/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-kube-api-access-rgjkp\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-npd99\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:56:36.870925 master-0 kubenswrapper[9368]: I1203 19:56:36.866237 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch"] Dec 03 19:56:36.870925 master-0 kubenswrapper[9368]: I1203 19:56:36.866933 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 03 19:56:36.870925 master-0 kubenswrapper[9368]: I1203 19:56:36.868797 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 19:56:36.871142 master-0 kubenswrapper[9368]: I1203 19:56:36.870984 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:56:36.880951 master-0 kubenswrapper[9368]: I1203 19:56:36.880902 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ad22d8ed-2476-441b-aa3b-a7845606b0ac-images\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" Dec 03 19:56:36.881078 master-0 kubenswrapper[9368]: I1203 19:56:36.880982 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad22d8ed-2476-441b-aa3b-a7845606b0ac-machine-api-operator-tls\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" Dec 03 19:56:36.881078 master-0 kubenswrapper[9368]: I1203 19:56:36.881009 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad22d8ed-2476-441b-aa3b-a7845606b0ac-config\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" Dec 03 19:56:36.881078 master-0 kubenswrapper[9368]: I1203 19:56:36.881029 9368 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjn9m\" (UniqueName: \"kubernetes.io/projected/ad22d8ed-2476-441b-aa3b-a7845606b0ac-kube-api-access-xjn9m\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" Dec 03 19:56:36.887353 master-0 kubenswrapper[9368]: I1203 19:56:36.887193 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch"] Dec 03 19:56:36.981839 master-0 kubenswrapper[9368]: I1203 19:56:36.981746 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad22d8ed-2476-441b-aa3b-a7845606b0ac-machine-api-operator-tls\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" Dec 03 19:56:36.982020 master-0 kubenswrapper[9368]: I1203 19:56:36.981845 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-webhook-cert\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 03 19:56:36.982020 master-0 kubenswrapper[9368]: I1203 19:56:36.981884 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad22d8ed-2476-441b-aa3b-a7845606b0ac-config\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" Dec 03 19:56:36.982020 master-0 kubenswrapper[9368]: I1203 19:56:36.981910 9368 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-tmpfs\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 03 19:56:36.982020 master-0 kubenswrapper[9368]: I1203 19:56:36.981940 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjn9m\" (UniqueName: \"kubernetes.io/projected/ad22d8ed-2476-441b-aa3b-a7845606b0ac-kube-api-access-xjn9m\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" Dec 03 19:56:36.982020 master-0 kubenswrapper[9368]: I1203 19:56:36.981977 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-apiservice-cert\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 03 19:56:36.982170 master-0 kubenswrapper[9368]: I1203 19:56:36.982021 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7t26\" (UniqueName: \"kubernetes.io/projected/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-kube-api-access-k7t26\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 03 19:56:36.982170 master-0 kubenswrapper[9368]: I1203 19:56:36.982048 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ad22d8ed-2476-441b-aa3b-a7845606b0ac-images\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: 
\"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" Dec 03 19:56:36.982400 master-0 kubenswrapper[9368]: E1203 19:56:36.982364 9368 secret.go:189] Couldn't get secret openshift-machine-api/machine-api-operator-tls: secret "machine-api-operator-tls" not found Dec 03 19:56:36.982437 master-0 kubenswrapper[9368]: E1203 19:56:36.982427 9368 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad22d8ed-2476-441b-aa3b-a7845606b0ac-machine-api-operator-tls podName:ad22d8ed-2476-441b-aa3b-a7845606b0ac nodeName:}" failed. No retries permitted until 2025-12-03 19:56:37.482410842 +0000 UTC m=+63.143660753 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/ad22d8ed-2476-441b-aa3b-a7845606b0ac-machine-api-operator-tls") pod "machine-api-operator-7486ff55f-9p9rq" (UID: "ad22d8ed-2476-441b-aa3b-a7845606b0ac") : secret "machine-api-operator-tls" not found Dec 03 19:56:36.983015 master-0 kubenswrapper[9368]: I1203 19:56:36.982983 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ad22d8ed-2476-441b-aa3b-a7845606b0ac-images\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" Dec 03 19:56:36.985680 master-0 kubenswrapper[9368]: I1203 19:56:36.984106 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad22d8ed-2476-441b-aa3b-a7845606b0ac-config\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" Dec 03 19:56:37.001747 master-0 kubenswrapper[9368]: I1203 19:56:37.001691 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xjn9m\" (UniqueName: \"kubernetes.io/projected/ad22d8ed-2476-441b-aa3b-a7845606b0ac-kube-api-access-xjn9m\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" Dec 03 19:56:37.083921 master-0 kubenswrapper[9368]: I1203 19:56:37.083811 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-apiservice-cert\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 03 19:56:37.084058 master-0 kubenswrapper[9368]: I1203 19:56:37.083949 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7t26\" (UniqueName: \"kubernetes.io/projected/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-kube-api-access-k7t26\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 03 19:56:37.084097 master-0 kubenswrapper[9368]: I1203 19:56:37.084060 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-webhook-cert\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 03 19:56:37.084129 master-0 kubenswrapper[9368]: I1203 19:56:37.084092 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-tmpfs\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 
03 19:56:37.084664 master-0 kubenswrapper[9368]: I1203 19:56:37.084632 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-tmpfs\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch"
Dec 03 19:56:37.089078 master-0 kubenswrapper[9368]: I1203 19:56:37.089040 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-apiservice-cert\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch"
Dec 03 19:56:37.090128 master-0 kubenswrapper[9368]: I1203 19:56:37.090091 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-webhook-cert\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch"
Dec 03 19:56:37.114745 master-0 kubenswrapper[9368]: I1203 19:56:37.114668 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7t26\" (UniqueName: \"kubernetes.io/projected/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-kube-api-access-k7t26\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch"
Dec 03 19:56:37.192453 master-0 kubenswrapper[9368]: I1203 19:56:37.192384 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch"
Dec 03 19:56:37.501165 master-0 kubenswrapper[9368]: I1203 19:56:37.499403 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad22d8ed-2476-441b-aa3b-a7845606b0ac-machine-api-operator-tls\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq"
Dec 03 19:56:37.528594 master-0 kubenswrapper[9368]: I1203 19:56:37.528548 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad22d8ed-2476-441b-aa3b-a7845606b0ac-machine-api-operator-tls\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq"
Dec 03 19:56:37.758622 master-0 kubenswrapper[9368]: I1203 19:56:37.754529 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq"
Dec 03 19:56:37.802253 master-0 kubenswrapper[9368]: I1203 19:56:37.802075 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/installer-1-master-0"]
Dec 03 19:56:37.866077 master-0 kubenswrapper[9368]: I1203 19:56:37.865923 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Dec 03 19:56:38.030438 master-0 kubenswrapper[9368]: I1203 19:56:38.030408 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"]
Dec 03 19:56:38.039156 master-0 kubenswrapper[9368]: I1203 19:56:38.037920 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv"]
Dec 03 19:56:38.050865 master-0 kubenswrapper[9368]: I1203 19:56:38.048471 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"]
Dec 03 19:56:38.053077 master-0 kubenswrapper[9368]: I1203 19:56:38.052434 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr"]
Dec 03 19:56:38.234174 master-0 kubenswrapper[9368]: I1203 19:56:38.233570 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"33ba32c7-9e77-419f-b417-8f1aa28ecd5d","Type":"ContainerStarted","Data":"764602bbdcf44f32b31fcf6b225c7cc11878e8ab080298308eed45bd7554c4ed"}
Dec 03 19:56:38.262042 master-0 kubenswrapper[9368]: I1203 19:56:38.244531 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2c8x" event={"ID":"acb1d894-1bc0-478d-87fc-e9137291df70","Type":"ContainerStarted","Data":"206ac88753a216a598da6c64c40223604a0df8c6bc77dd259579b973bbcfb8a9"}
Dec 03 19:56:38.262042 master-0 kubenswrapper[9368]: I1203 19:56:38.249844 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv" event={"ID":"6a82ff78-4383-4ca8-8a72-98c2ee50ffe2","Type":"ContainerStarted","Data":"1cd08c33a38d123c20d17a144cb73cdc913867f657f3ed47969c25f2ac5811c9"}
Dec 03 19:56:38.291059 master-0 kubenswrapper[9368]: I1203 19:56:38.291006 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb"]
Dec 03 19:56:38.299833 master-0 kubenswrapper[9368]: I1203 19:56:38.299766 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sp868" event={"ID":"48dfa48e-caea-4017-bd3e-d1da8bcd2da7","Type":"ContainerStarted","Data":"1a9171c75fb14718020761f1f71ba22ead121950532b19c76cab72343f2fbba6"}
Dec 03 19:56:38.300221 master-0 kubenswrapper[9368]: I1203 19:56:38.300203 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p"]
Dec 03 19:56:38.318743 master-0 kubenswrapper[9368]: I1203 19:56:38.316522 9368 generic.go:334] "Generic (PLEG): container finished" podID="81839b26-cf66-4532-a646-ef4cd5d5e471" containerID="56dbd4722d7e9613178a67c106fd164ecc8009c6b4f5a3da4ca79cccc369cdb2" exitCode=0
Dec 03 19:56:38.319344 master-0 kubenswrapper[9368]: I1203 19:56:38.319212 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-3-master-0" podUID="93a5a792-4066-4863-a409-aaeb1b6ac193" containerName="installer" containerID="cri-o://66ccac620b2bd69ac712560775a97feae4a9715092b971307100f7bf0d79a06a" gracePeriod=30
Dec 03 19:56:38.323444 master-0 kubenswrapper[9368]: I1203 19:56:38.312345 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4"]
Dec 03 19:56:38.323444 master-0 kubenswrapper[9368]: I1203 19:56:38.323402 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"]
Dec 03 19:56:38.324215 master-0 kubenswrapper[9368]: I1203 19:56:38.324115 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc8kx" event={"ID":"81839b26-cf66-4532-a646-ef4cd5d5e471","Type":"ContainerDied","Data":"56dbd4722d7e9613178a67c106fd164ecc8009c6b4f5a3da4ca79cccc369cdb2"}
Dec 03 19:56:38.324215 master-0 kubenswrapper[9368]: I1203 19:56:38.324183 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"93a5a792-4066-4863-a409-aaeb1b6ac193","Type":"ContainerStarted","Data":"66ccac620b2bd69ac712560775a97feae4a9715092b971307100f7bf0d79a06a"}
Dec 03 19:56:38.324215 master-0 kubenswrapper[9368]: I1203 19:56:38.324203 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zrxk" event={"ID":"af6f6483-5ca1-48b7-90b5-b03d460d041a","Type":"ContainerStarted","Data":"c2f11c6f26a2101a0f506435a5b72e6155823723906d7b9e852ca9417dd9baf9"}
Dec 03 19:56:38.325283 master-0 kubenswrapper[9368]: I1203 19:56:38.325164 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"186cc14f-5f58-43ca-8ffa-db07606ff0f7","Type":"ContainerStarted","Data":"5217957523f4b5166716d8ff3b268cfc1e054e38ab89fcd916d9adc0a629dce1"}
Dec 03 19:56:38.326430 master-0 kubenswrapper[9368]: I1203 19:56:38.326383 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"ce4afc7a-a338-4a2c-bada-22d4bac75d49","Type":"ContainerStarted","Data":"5c7672f753235f31861db5762e7805d7dbeffaa2c208518211750ae8f4c45f42"}
Dec 03 19:56:38.327256 master-0 kubenswrapper[9368]: I1203 19:56:38.327210 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" event={"ID":"6404bbc7-8ca9-4f20-8ce7-40f855555160","Type":"ContainerStarted","Data":"5bee7d8031a36fa09960f186184717b2ac09e44e86995d183c886a9ab1dcdca8"}
Dec 03 19:56:38.328179 master-0 kubenswrapper[9368]: I1203 19:56:38.328155 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg" event={"ID":"cd35fc5f-07ab-4c66-9b80-33a598d417ef","Type":"ContainerStarted","Data":"0b9c573f7ba19dc3323c14093fb10a43f5d1d1f19bc23f8da28f974d65efe3f1"}
Dec 03 19:56:38.332435 master-0 kubenswrapper[9368]: I1203 19:56:38.332377 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" event={"ID":"61b16a8a-27a2-4a07-b5f9-10a5be2ec870","Type":"ContainerStarted","Data":"f99c24374916ccefecbe6788346b4cb9fb3b6dbba7b45f5a9bea3621fcd4bafb"}
Dec 03 19:56:38.333894 master-0 kubenswrapper[9368]: I1203 19:56:38.333821 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck" event={"ID":"b5cad72f-5bbf-42fc-9d63-545a01c98cbe","Type":"ContainerStarted","Data":"c4dc9f4dd5e88018642a46232bff77d5e6ea06620de4db64db7e71c41383a65d"}
Dec 03 19:56:38.335042 master-0 kubenswrapper[9368]: I1203 19:56:38.334986 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"a454ceb3-7801-4bf0-82d3-112ab93b687b","Type":"ContainerStarted","Data":"22fb555625ad569a949864018978374315453d83ca6adc979cc16aef974813c6"}
Dec 03 19:56:38.338101 master-0 kubenswrapper[9368]: I1203 19:56:38.338066 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" event={"ID":"61ca5373-413c-4824-ba19-13b99c3081e4","Type":"ContainerStarted","Data":"3b0aba4add3cc1310a8315895b5136f2d6481203591c17752e5cefa8d38657ee"}
Dec 03 19:56:38.338221 master-0 kubenswrapper[9368]: I1203 19:56:38.338208 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" event={"ID":"61ca5373-413c-4824-ba19-13b99c3081e4","Type":"ContainerStarted","Data":"aea6f3a9262e629da79db6eb6db6e7fa4b11e6388c15a6c5cfabb34f955bd062"}
Dec 03 19:56:38.345433 master-0 kubenswrapper[9368]: I1203 19:56:38.345385 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg" event={"ID":"e727b97c-263b-430a-8502-106236863710","Type":"ContainerStarted","Data":"b6cdf87e1cdb392734b594d781cade9c286b5f000271d39a984a30d2c054909c"}
Dec 03 19:56:38.345741 master-0 kubenswrapper[9368]: I1203 19:56:38.345707 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg" podUID="e727b97c-263b-430a-8502-106236863710" containerName="controller-manager" containerID="cri-o://b6cdf87e1cdb392734b594d781cade9c286b5f000271d39a984a30d2c054909c" gracePeriod=30
Dec 03 19:56:38.346397 master-0 kubenswrapper[9368]: I1203 19:56:38.346385 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:38.357352 master-0 kubenswrapper[9368]: I1203 19:56:38.356592 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-3-master-0" podStartSLOduration=28.356576975 podStartE2EDuration="28.356576975s" podCreationTimestamp="2025-12-03 19:56:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:56:38.355535818 +0000 UTC m=+64.016785739" watchObservedRunningTime="2025-12-03 19:56:38.356576975 +0000 UTC m=+64.017826886"
Dec 03 19:56:38.364473 master-0 kubenswrapper[9368]: I1203 19:56:38.363948 9368 patch_prober.go:28] interesting pod/controller-manager-68f766fc9-lwgcg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.48:8443/healthz\": read tcp 10.128.0.2:42836->10.128.0.48:8443: read: connection reset by peer" start-of-body=
Dec 03 19:56:38.364473 master-0 kubenswrapper[9368]: I1203 19:56:38.364014 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg" podUID="e727b97c-263b-430a-8502-106236863710" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.48:8443/healthz\": read tcp 10.128.0.2:42836->10.128.0.48:8443: read: connection reset by peer"
Dec 03 19:56:38.422809 master-0 kubenswrapper[9368]: I1203 19:56:38.421937 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-master-0" podStartSLOduration=27.421918715 podStartE2EDuration="27.421918715s" podCreationTimestamp="2025-12-03 19:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:56:38.421691059 +0000 UTC m=+64.082940970" watchObservedRunningTime="2025-12-03 19:56:38.421918715 +0000 UTC m=+64.083168626"
Dec 03 19:56:38.443926 master-0 kubenswrapper[9368]: I1203 19:56:38.443064 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg" podStartSLOduration=15.058405022 podStartE2EDuration="28.443043922s" podCreationTimestamp="2025-12-03 19:56:10 +0000 UTC" firstStartedPulling="2025-12-03 19:56:24.152668766 +0000 UTC m=+49.813918687" lastFinishedPulling="2025-12-03 19:56:37.537307676 +0000 UTC m=+63.198557587" observedRunningTime="2025-12-03 19:56:38.442734974 +0000 UTC m=+64.103984885" watchObservedRunningTime="2025-12-03 19:56:38.443043922 +0000 UTC m=+64.104293833"
Dec 03 19:56:38.468830 master-0 kubenswrapper[9368]: W1203 19:56:38.468796 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2021db5_b27a_4e06_beec_d9ba82aa1ffc.slice/crio-b9062d56a3074fcc3f3a4a8ecee0d9736b5e9e6f4c5eef18fa307a87652c36a3 WatchSource:0}: Error finding container b9062d56a3074fcc3f3a4a8ecee0d9736b5e9e6f4c5eef18fa307a87652c36a3: Status 404 returned error can't find the container with id b9062d56a3074fcc3f3a4a8ecee0d9736b5e9e6f4c5eef18fa307a87652c36a3
Dec 03 19:56:38.561610 master-0 kubenswrapper[9368]: W1203 19:56:38.561549 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dbbb6f8_711c_49a0_bc36_fa5d50124bd8.slice/crio-75ad395a74500e699e0114a02b486d58badb2f6e46a9b16d69b6836ed61de9f2 WatchSource:0}: Error finding container 75ad395a74500e699e0114a02b486d58badb2f6e46a9b16d69b6836ed61de9f2: Status 404 returned error can't find the container with id 75ad395a74500e699e0114a02b486d58badb2f6e46a9b16d69b6836ed61de9f2
Dec 03 19:56:38.647165 master-0 kubenswrapper[9368]: I1203 19:56:38.647121 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch"]
Dec 03 19:56:38.662147 master-0 kubenswrapper[9368]: I1203 19:56:38.662069 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Dec 03 19:56:38.666855 master-0 kubenswrapper[9368]: I1203 19:56:38.666679 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-7486ff55f-9p9rq"]
Dec 03 19:56:38.677153 master-0 kubenswrapper[9368]: I1203 19:56:38.677115 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-59d99f9b7b-h64kt"]
Dec 03 19:56:38.680367 master-0 kubenswrapper[9368]: I1203 19:56:38.680316 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"]
Dec 03 19:56:38.680562 master-0 kubenswrapper[9368]: W1203 19:56:38.680473 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9afe01c7_825c_43d1_8425_0317cdde11d6.slice/crio-d01898ca09cc6e5ead466458571ed251bc45975a2add401e6cca184da08be158 WatchSource:0}: Error finding container d01898ca09cc6e5ead466458571ed251bc45975a2add401e6cca184da08be158: Status 404 returned error can't find the container with id d01898ca09cc6e5ead466458571ed251bc45975a2add401e6cca184da08be158
Dec 03 19:56:38.687828 master-0 kubenswrapper[9368]: W1203 19:56:38.687764 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf2023e1_9c7a_40af_a6bf_fba31c3565b1.slice/crio-1cf57a007c4e3680497adee52392a99d33552f24788c0574cbafbc31f9dc73f4 WatchSource:0}: Error finding container 1cf57a007c4e3680497adee52392a99d33552f24788c0574cbafbc31f9dc73f4: Status 404 returned error can't find the container with id 1cf57a007c4e3680497adee52392a99d33552f24788c0574cbafbc31f9dc73f4
Dec 03 19:56:38.697694 master-0 kubenswrapper[9368]: I1203 19:56:38.697558 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_93a5a792-4066-4863-a409-aaeb1b6ac193/installer/0.log"
Dec 03 19:56:38.697694 master-0 kubenswrapper[9368]: I1203 19:56:38.697616 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Dec 03 19:56:38.719242 master-0 kubenswrapper[9368]: W1203 19:56:38.716389 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbacd155a_fee3_4e5e_89a2_ab86f401d2ff.slice/crio-40375e8c9304a9008fd3f0ffbd7abeeba9e1599c1f09821321074397cef514ba WatchSource:0}: Error finding container 40375e8c9304a9008fd3f0ffbd7abeeba9e1599c1f09821321074397cef514ba: Status 404 returned error can't find the container with id 40375e8c9304a9008fd3f0ffbd7abeeba9e1599c1f09821321074397cef514ba
Dec 03 19:56:38.747765 master-0 kubenswrapper[9368]: I1203 19:56:38.747582 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg"
Dec 03 19:56:38.820461 master-0 kubenswrapper[9368]: I1203 19:56:38.820419 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/93a5a792-4066-4863-a409-aaeb1b6ac193-var-lock\") pod \"93a5a792-4066-4863-a409-aaeb1b6ac193\" (UID: \"93a5a792-4066-4863-a409-aaeb1b6ac193\") "
Dec 03 19:56:38.820594 master-0 kubenswrapper[9368]: I1203 19:56:38.820557 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93a5a792-4066-4863-a409-aaeb1b6ac193-kubelet-dir\") pod \"93a5a792-4066-4863-a409-aaeb1b6ac193\" (UID: \"93a5a792-4066-4863-a409-aaeb1b6ac193\") "
Dec 03 19:56:38.820685 master-0 kubenswrapper[9368]: I1203 19:56:38.820547 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93a5a792-4066-4863-a409-aaeb1b6ac193-var-lock" (OuterVolumeSpecName: "var-lock") pod "93a5a792-4066-4863-a409-aaeb1b6ac193" (UID: "93a5a792-4066-4863-a409-aaeb1b6ac193"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:56:38.820745 master-0 kubenswrapper[9368]: I1203 19:56:38.820584 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93a5a792-4066-4863-a409-aaeb1b6ac193-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "93a5a792-4066-4863-a409-aaeb1b6ac193" (UID: "93a5a792-4066-4863-a409-aaeb1b6ac193"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:56:38.820745 master-0 kubenswrapper[9368]: I1203 19:56:38.820666 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93a5a792-4066-4863-a409-aaeb1b6ac193-kube-api-access\") pod \"93a5a792-4066-4863-a409-aaeb1b6ac193\" (UID: \"93a5a792-4066-4863-a409-aaeb1b6ac193\") "
Dec 03 19:56:38.821157 master-0 kubenswrapper[9368]: I1203 19:56:38.821131 9368 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/93a5a792-4066-4863-a409-aaeb1b6ac193-var-lock\") on node \"master-0\" DevicePath \"\""
Dec 03 19:56:38.821157 master-0 kubenswrapper[9368]: I1203 19:56:38.821151 9368 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93a5a792-4066-4863-a409-aaeb1b6ac193-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Dec 03 19:56:38.835562 master-0 kubenswrapper[9368]: I1203 19:56:38.835511 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93a5a792-4066-4863-a409-aaeb1b6ac193-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "93a5a792-4066-4863-a409-aaeb1b6ac193" (UID: "93a5a792-4066-4863-a409-aaeb1b6ac193"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:56:38.922198 master-0 kubenswrapper[9368]: I1203 19:56:38.922128 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e727b97c-263b-430a-8502-106236863710-proxy-ca-bundles\") pod \"e727b97c-263b-430a-8502-106236863710\" (UID: \"e727b97c-263b-430a-8502-106236863710\") "
Dec 03 19:56:38.922198 master-0 kubenswrapper[9368]: I1203 19:56:38.922199 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e727b97c-263b-430a-8502-106236863710-client-ca\") pod \"e727b97c-263b-430a-8502-106236863710\" (UID: \"e727b97c-263b-430a-8502-106236863710\") "
Dec 03 19:56:38.922310 master-0 kubenswrapper[9368]: I1203 19:56:38.922227 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e727b97c-263b-430a-8502-106236863710-config\") pod \"e727b97c-263b-430a-8502-106236863710\" (UID: \"e727b97c-263b-430a-8502-106236863710\") "
Dec 03 19:56:38.922310 master-0 kubenswrapper[9368]: I1203 19:56:38.922263 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e727b97c-263b-430a-8502-106236863710-serving-cert\") pod \"e727b97c-263b-430a-8502-106236863710\" (UID: \"e727b97c-263b-430a-8502-106236863710\") "
Dec 03 19:56:38.922362 master-0 kubenswrapper[9368]: I1203 19:56:38.922317 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnxkv\" (UniqueName: \"kubernetes.io/projected/e727b97c-263b-430a-8502-106236863710-kube-api-access-qnxkv\") pod \"e727b97c-263b-430a-8502-106236863710\" (UID: \"e727b97c-263b-430a-8502-106236863710\") "
Dec 03 19:56:38.922558 master-0 kubenswrapper[9368]: I1203 19:56:38.922537 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93a5a792-4066-4863-a409-aaeb1b6ac193-kube-api-access\") on node \"master-0\" DevicePath \"\""
Dec 03 19:56:38.923453 master-0 kubenswrapper[9368]: I1203 19:56:38.923425 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e727b97c-263b-430a-8502-106236863710-config" (OuterVolumeSpecName: "config") pod "e727b97c-263b-430a-8502-106236863710" (UID: "e727b97c-263b-430a-8502-106236863710"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 19:56:38.923503 master-0 kubenswrapper[9368]: I1203 19:56:38.923449 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e727b97c-263b-430a-8502-106236863710-client-ca" (OuterVolumeSpecName: "client-ca") pod "e727b97c-263b-430a-8502-106236863710" (UID: "e727b97c-263b-430a-8502-106236863710"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 19:56:38.923503 master-0 kubenswrapper[9368]: I1203 19:56:38.923417 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e727b97c-263b-430a-8502-106236863710-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e727b97c-263b-430a-8502-106236863710" (UID: "e727b97c-263b-430a-8502-106236863710"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 19:56:38.927395 master-0 kubenswrapper[9368]: I1203 19:56:38.927316 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e727b97c-263b-430a-8502-106236863710-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e727b97c-263b-430a-8502-106236863710" (UID: "e727b97c-263b-430a-8502-106236863710"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:56:38.929746 master-0 kubenswrapper[9368]: I1203 19:56:38.929675 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e727b97c-263b-430a-8502-106236863710-kube-api-access-qnxkv" (OuterVolumeSpecName: "kube-api-access-qnxkv") pod "e727b97c-263b-430a-8502-106236863710" (UID: "e727b97c-263b-430a-8502-106236863710"). InnerVolumeSpecName "kube-api-access-qnxkv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:56:39.035802 master-0 kubenswrapper[9368]: I1203 19:56:39.032677 9368 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e727b97c-263b-430a-8502-106236863710-client-ca\") on node \"master-0\" DevicePath \"\""
Dec 03 19:56:39.035802 master-0 kubenswrapper[9368]: I1203 19:56:39.032713 9368 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e727b97c-263b-430a-8502-106236863710-config\") on node \"master-0\" DevicePath \"\""
Dec 03 19:56:39.035802 master-0 kubenswrapper[9368]: I1203 19:56:39.032722 9368 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e727b97c-263b-430a-8502-106236863710-serving-cert\") on node \"master-0\" DevicePath \"\""
Dec 03 19:56:39.035802 master-0 kubenswrapper[9368]: I1203 19:56:39.032732 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnxkv\" (UniqueName: \"kubernetes.io/projected/e727b97c-263b-430a-8502-106236863710-kube-api-access-qnxkv\") on node \"master-0\" DevicePath \"\""
Dec 03 19:56:39.035802 master-0 kubenswrapper[9368]: I1203 19:56:39.032742 9368 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e727b97c-263b-430a-8502-106236863710-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Dec 03 19:56:39.368403 master-0 kubenswrapper[9368]: I1203 19:56:39.368359 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_93a5a792-4066-4863-a409-aaeb1b6ac193/installer/0.log"
Dec 03 19:56:39.368504 master-0 kubenswrapper[9368]: I1203 19:56:39.368402 9368 generic.go:334] "Generic (PLEG): container finished" podID="93a5a792-4066-4863-a409-aaeb1b6ac193" containerID="66ccac620b2bd69ac712560775a97feae4a9715092b971307100f7bf0d79a06a" exitCode=1
Dec 03 19:56:39.368504 master-0 kubenswrapper[9368]: I1203 19:56:39.368459 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"93a5a792-4066-4863-a409-aaeb1b6ac193","Type":"ContainerDied","Data":"66ccac620b2bd69ac712560775a97feae4a9715092b971307100f7bf0d79a06a"}
Dec 03 19:56:39.368504 master-0 kubenswrapper[9368]: I1203 19:56:39.368493 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"93a5a792-4066-4863-a409-aaeb1b6ac193","Type":"ContainerDied","Data":"d3ac5f831813ac492cbd45967896e8bddaee7f43a6c7d3b45ca773f53029fc6b"}
Dec 03 19:56:39.368617 master-0 kubenswrapper[9368]: I1203 19:56:39.368511 9368 scope.go:117] "RemoveContainer" containerID="66ccac620b2bd69ac712560775a97feae4a9715092b971307100f7bf0d79a06a"
Dec 03 19:56:39.368617 master-0 kubenswrapper[9368]: I1203 19:56:39.368609 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Dec 03 19:56:39.373632 master-0 kubenswrapper[9368]: I1203 19:56:39.373353 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"9afe01c7-825c-43d1-8425-0317cdde11d6","Type":"ContainerStarted","Data":"7defd583f52b28f4c8a42f8533bc6a235b9b9753c15d53b3d581070bd6b239c4"}
Dec 03 19:56:39.373632 master-0 kubenswrapper[9368]: I1203 19:56:39.373387 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"9afe01c7-825c-43d1-8425-0317cdde11d6","Type":"ContainerStarted","Data":"d01898ca09cc6e5ead466458571ed251bc45975a2add401e6cca184da08be158"}
Dec 03 19:56:39.376819 master-0 kubenswrapper[9368]: I1203 19:56:39.376767 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" event={"ID":"f749c7f2-1fd7-4078-a92d-0ae5523998ac","Type":"ContainerStarted","Data":"4a1b2034d20b8550395063b65a0de0eddb16cb0c3a6fde052b4127e400052376"}
Dec 03 19:56:39.379384 master-0 kubenswrapper[9368]: I1203 19:56:39.379345 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx" event={"ID":"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8","Type":"ContainerStarted","Data":"d2d3d82713322ae1aff81c7992c27a7d2f9dc41cefcbe5ca67d3de17b3288924"}
Dec 03 19:56:39.379448 master-0 kubenswrapper[9368]: I1203 19:56:39.379387 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx" event={"ID":"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8","Type":"ContainerStarted","Data":"33fc3458349b78bc19c8b30395e299c49cdfbf37f7e541929fe27fba4fc59440"}
Dec 03 19:56:39.379448 master-0 kubenswrapper[9368]: I1203 19:56:39.379400 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx" event={"ID":"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8","Type":"ContainerStarted","Data":"75ad395a74500e699e0114a02b486d58badb2f6e46a9b16d69b6836ed61de9f2"}
Dec 03 19:56:39.388576 master-0 kubenswrapper[9368]: I1203 19:56:39.387286 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_a454ceb3-7801-4bf0-82d3-112ab93b687b/installer/0.log"
Dec 03 19:56:39.388576 master-0 kubenswrapper[9368]: I1203 19:56:39.388228 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"a454ceb3-7801-4bf0-82d3-112ab93b687b","Type":"ContainerStarted","Data":"53fcd11329bfb04c78021321046cdb8bf43022a895b0f92bf8ee67416f18fc5f"}
Dec 03 19:56:39.389585 master-0 kubenswrapper[9368]: I1203 19:56:39.389546 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-master-0" podStartSLOduration=7.389525302 podStartE2EDuration="7.389525302s" podCreationTimestamp="2025-12-03 19:56:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:56:39.387105591 +0000 UTC m=+65.048355502" watchObservedRunningTime="2025-12-03 19:56:39.389525302 +0000 UTC m=+65.050775213"
Dec 03 19:56:39.402028 master-0 kubenswrapper[9368]: I1203 19:56:39.401887 9368 generic.go:334] "Generic (PLEG): container finished" podID="acb1d894-1bc0-478d-87fc-e9137291df70" containerID="206ac88753a216a598da6c64c40223604a0df8c6bc77dd259579b973bbcfb8a9" exitCode=0
Dec 03 19:56:39.402641 master-0 kubenswrapper[9368]: I1203 19:56:39.402613 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2c8x" event={"ID":"acb1d894-1bc0-478d-87fc-e9137291df70","Type":"ContainerDied","Data":"206ac88753a216a598da6c64c40223604a0df8c6bc77dd259579b973bbcfb8a9"}
Dec 03 19:56:39.407286 master-0 kubenswrapper[9368]: I1203 19:56:39.406988 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" event={"ID":"ad22d8ed-2476-441b-aa3b-a7845606b0ac","Type":"ContainerStarted","Data":"d025dc47eb91635263897472c3f550ef8be62955faec2c223afbf45ed47d08b1"}
Dec 03 19:56:39.407286 master-0 kubenswrapper[9368]: I1203 19:56:39.407198 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" event={"ID":"ad22d8ed-2476-441b-aa3b-a7845606b0ac","Type":"ContainerStarted","Data":"85f7ddcd30f09f1a0fda67d2dbaf1344d49e468b4e45601d31e0dfb9ac188ad5"}
Dec 03 19:56:39.410403 master-0 kubenswrapper[9368]: I1203 19:56:39.409275 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx" podStartSLOduration=11.409261674 podStartE2EDuration="11.409261674s" podCreationTimestamp="2025-12-03 19:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:56:39.408423562 +0000 UTC m=+65.069673473" watchObservedRunningTime="2025-12-03 19:56:39.409261674 +0000 UTC m=+65.070511605"
Dec 03 19:56:39.455826 master-0 kubenswrapper[9368]: I1203 19:56:39.450489 9368 scope.go:117] "RemoveContainer" containerID="66ccac620b2bd69ac712560775a97feae4a9715092b971307100f7bf0d79a06a"
Dec 03 19:56:39.468696 master-0 kubenswrapper[9368]: E1203 19:56:39.468653 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66ccac620b2bd69ac712560775a97feae4a9715092b971307100f7bf0d79a06a\": container with ID starting with 66ccac620b2bd69ac712560775a97feae4a9715092b971307100f7bf0d79a06a not found: ID does not exist" containerID="66ccac620b2bd69ac712560775a97feae4a9715092b971307100f7bf0d79a06a"
Dec 03 19:56:39.468885 master-0 kubenswrapper[9368]: I1203 19:56:39.468700 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66ccac620b2bd69ac712560775a97feae4a9715092b971307100f7bf0d79a06a"} err="failed to get container status \"66ccac620b2bd69ac712560775a97feae4a9715092b971307100f7bf0d79a06a\": rpc error: code = NotFound desc = could not find container \"66ccac620b2bd69ac712560775a97feae4a9715092b971307100f7bf0d79a06a\": container with ID starting with 66ccac620b2bd69ac712560775a97feae4a9715092b971307100f7bf0d79a06a not found: ID does not exist"
Dec 03 19:56:39.469503 master-0 kubenswrapper[9368]: I1203 19:56:39.469481 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zrxk" event={"ID":"af6f6483-5ca1-48b7-90b5-b03d460d041a","Type":"ContainerDied","Data":"c2f11c6f26a2101a0f506435a5b72e6155823723906d7b9e852ca9417dd9baf9"}
Dec 03 19:56:39.469831 master-0 kubenswrapper[9368]: I1203 19:56:39.468052 9368 generic.go:334] "Generic (PLEG): container finished" podID="af6f6483-5ca1-48b7-90b5-b03d460d041a" containerID="c2f11c6f26a2101a0f506435a5b72e6155823723906d7b9e852ca9417dd9baf9" exitCode=0
Dec 03 19:56:39.484395 master-0 kubenswrapper[9368]: I1203 19:56:39.484359 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Dec 03 19:56:39.492807 master-0 kubenswrapper[9368]: I1203 19:56:39.492753 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Dec 03 19:56:39.498025 master-0 kubenswrapper[9368]: I1203 19:56:39.498002 9368 generic.go:334] "Generic (PLEG): container finished" podID="48dfa48e-caea-4017-bd3e-d1da8bcd2da7" containerID="1a9171c75fb14718020761f1f71ba22ead121950532b19c76cab72343f2fbba6"
exitCode=0 Dec 03 19:56:39.498081 master-0 kubenswrapper[9368]: I1203 19:56:39.498067 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sp868" event={"ID":"48dfa48e-caea-4017-bd3e-d1da8bcd2da7","Type":"ContainerDied","Data":"1a9171c75fb14718020761f1f71ba22ead121950532b19c76cab72343f2fbba6"} Dec 03 19:56:39.500366 master-0 kubenswrapper[9368]: I1203 19:56:39.500343 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"bacd155a-fee3-4e5e-89a2-ab86f401d2ff","Type":"ContainerStarted","Data":"82116db57e57089f2a0aaaa865b4d91e3469d2022d11777a1eb493f1bba12223"} Dec 03 19:56:39.500410 master-0 kubenswrapper[9368]: I1203 19:56:39.500369 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"bacd155a-fee3-4e5e-89a2-ab86f401d2ff","Type":"ContainerStarted","Data":"40375e8c9304a9008fd3f0ffbd7abeeba9e1599c1f09821321074397cef514ba"} Dec 03 19:56:39.501840 master-0 kubenswrapper[9368]: I1203 19:56:39.501813 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"ce4afc7a-a338-4a2c-bada-22d4bac75d49","Type":"ContainerStarted","Data":"6734488c6ce6905e5e770b668e83066dd3b8267a0d3cf0d97567edcd50a10461"} Dec 03 19:56:39.508153 master-0 kubenswrapper[9368]: I1203 19:56:39.508129 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" event={"ID":"433c3273-c99e-4d68-befc-06f92d2fc8d5","Type":"ContainerStarted","Data":"3cff086346a7c5e3777cf149e0e1d8f97d1a0c5b1f9e52848dc132dcdccf253d"} Dec 03 19:56:39.512617 master-0 kubenswrapper[9368]: I1203 19:56:39.512586 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_33ba32c7-9e77-419f-b417-8f1aa28ecd5d/installer/0.log" Dec 03 19:56:39.512688 master-0 kubenswrapper[9368]: I1203 19:56:39.512620 9368 
generic.go:334] "Generic (PLEG): container finished" podID="33ba32c7-9e77-419f-b417-8f1aa28ecd5d" containerID="31625955464d6bd81373d34abab51a1683cf14b5575194ba5d77e88634d8044d" exitCode=1 Dec 03 19:56:39.512688 master-0 kubenswrapper[9368]: I1203 19:56:39.512636 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"33ba32c7-9e77-419f-b417-8f1aa28ecd5d","Type":"ContainerDied","Data":"31625955464d6bd81373d34abab51a1683cf14b5575194ba5d77e88634d8044d"} Dec 03 19:56:39.515927 master-0 kubenswrapper[9368]: I1203 19:56:39.515900 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" event={"ID":"6404bbc7-8ca9-4f20-8ce7-40f855555160","Type":"ContainerStarted","Data":"95dc699db5e81ae88f29c91ef83348bafd60904436ba3ebce1844f19bbeefcbf"} Dec 03 19:56:39.532694 master-0 kubenswrapper[9368]: I1203 19:56:39.532626 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-master-0" podStartSLOduration=16.532612576 podStartE2EDuration="16.532612576s" podCreationTimestamp="2025-12-03 19:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:56:39.530158574 +0000 UTC m=+65.191408485" watchObservedRunningTime="2025-12-03 19:56:39.532612576 +0000 UTC m=+65.193862487" Dec 03 19:56:39.542794 master-0 kubenswrapper[9368]: I1203 19:56:39.539272 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4" event={"ID":"b2021db5-b27a-4e06-beec-d9ba82aa1ffc","Type":"ContainerStarted","Data":"1e33b64ebd7aca7528bb9b0bc12add6ddb2167e58a0055cf6eaa222df156d011"} Dec 03 19:56:39.542794 master-0 kubenswrapper[9368]: I1203 19:56:39.539309 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4" event={"ID":"b2021db5-b27a-4e06-beec-d9ba82aa1ffc","Type":"ContainerStarted","Data":"b9062d56a3074fcc3f3a4a8ecee0d9736b5e9e6f4c5eef18fa307a87652c36a3"} Dec 03 19:56:39.542794 master-0 kubenswrapper[9368]: I1203 19:56:39.541985 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" event={"ID":"af2023e1-9c7a-40af-a6bf-fba31c3565b1","Type":"ContainerStarted","Data":"1cf57a007c4e3680497adee52392a99d33552f24788c0574cbafbc31f9dc73f4"} Dec 03 19:56:39.547830 master-0 kubenswrapper[9368]: I1203 19:56:39.543183 9368 generic.go:334] "Generic (PLEG): container finished" podID="e727b97c-263b-430a-8502-106236863710" containerID="b6cdf87e1cdb392734b594d781cade9c286b5f000271d39a984a30d2c054909c" exitCode=0 Dec 03 19:56:39.547830 master-0 kubenswrapper[9368]: I1203 19:56:39.543270 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg" Dec 03 19:56:39.550718 master-0 kubenswrapper[9368]: I1203 19:56:39.550453 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg" event={"ID":"e727b97c-263b-430a-8502-106236863710","Type":"ContainerDied","Data":"b6cdf87e1cdb392734b594d781cade9c286b5f000271d39a984a30d2c054909c"} Dec 03 19:56:39.550718 master-0 kubenswrapper[9368]: I1203 19:56:39.550510 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68f766fc9-lwgcg" event={"ID":"e727b97c-263b-430a-8502-106236863710","Type":"ContainerDied","Data":"c2527388466746c6c4766bb74fdac1bb1cb6d4de2564764a7133c319d838a12f"} Dec 03 19:56:39.550718 master-0 kubenswrapper[9368]: I1203 19:56:39.550528 9368 scope.go:117] "RemoveContainer" containerID="b6cdf87e1cdb392734b594d781cade9c286b5f000271d39a984a30d2c054909c" Dec 03 19:56:39.555435 master-0 
kubenswrapper[9368]: I1203 19:56:39.555383 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" event={"ID":"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e","Type":"ContainerStarted","Data":"36d664599448b8d47d8c462d71e9206835fdec07754cadb49ede3eff874fc91e"} Dec 03 19:56:39.555488 master-0 kubenswrapper[9368]: I1203 19:56:39.555444 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" event={"ID":"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e","Type":"ContainerStarted","Data":"7311eb8e0cfeb885addad4bf6c0ceae3553a0417b770ce4938a40cee85fb2dfd"} Dec 03 19:56:39.555902 master-0 kubenswrapper[9368]: I1203 19:56:39.555876 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 03 19:56:39.559377 master-0 kubenswrapper[9368]: I1203 19:56:39.559331 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-master-0" podStartSLOduration=7.559321405 podStartE2EDuration="7.559321405s" podCreationTimestamp="2025-12-03 19:56:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:56:39.556236617 +0000 UTC m=+65.217486528" watchObservedRunningTime="2025-12-03 19:56:39.559321405 +0000 UTC m=+65.220571316" Dec 03 19:56:39.629662 master-0 kubenswrapper[9368]: I1203 19:56:39.629589 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" podStartSLOduration=3.629569629 podStartE2EDuration="3.629569629s" podCreationTimestamp="2025-12-03 19:56:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:56:39.626573083 +0000 UTC m=+65.287823014" 
watchObservedRunningTime="2025-12-03 19:56:39.629569629 +0000 UTC m=+65.290819540" Dec 03 19:56:39.649119 master-0 kubenswrapper[9368]: I1203 19:56:39.649077 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-68f766fc9-lwgcg"] Dec 03 19:56:39.651394 master-0 kubenswrapper[9368]: I1203 19:56:39.651352 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-68f766fc9-lwgcg"] Dec 03 19:56:39.892895 master-0 kubenswrapper[9368]: I1203 19:56:39.889944 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 03 19:56:40.551924 master-0 kubenswrapper[9368]: I1203 19:56:40.551832 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93a5a792-4066-4863-a409-aaeb1b6ac193" path="/var/lib/kubelet/pods/93a5a792-4066-4863-a409-aaeb1b6ac193/volumes" Dec 03 19:56:40.552863 master-0 kubenswrapper[9368]: I1203 19:56:40.552533 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e727b97c-263b-430a-8502-106236863710" path="/var/lib/kubelet/pods/e727b97c-263b-430a-8502-106236863710/volumes" Dec 03 19:56:40.563214 master-0 kubenswrapper[9368]: I1203 19:56:40.563177 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_a454ceb3-7801-4bf0-82d3-112ab93b687b/installer/0.log" Dec 03 19:56:40.563422 master-0 kubenswrapper[9368]: I1203 19:56:40.563227 9368 generic.go:334] "Generic (PLEG): container finished" podID="a454ceb3-7801-4bf0-82d3-112ab93b687b" containerID="53fcd11329bfb04c78021321046cdb8bf43022a895b0f92bf8ee67416f18fc5f" exitCode=1 Dec 03 19:56:40.563422 master-0 kubenswrapper[9368]: I1203 19:56:40.563284 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" 
event={"ID":"a454ceb3-7801-4bf0-82d3-112ab93b687b","Type":"ContainerDied","Data":"53fcd11329bfb04c78021321046cdb8bf43022a895b0f92bf8ee67416f18fc5f"} Dec 03 19:56:40.729127 master-0 kubenswrapper[9368]: I1203 19:56:40.728573 9368 scope.go:117] "RemoveContainer" containerID="b6cdf87e1cdb392734b594d781cade9c286b5f000271d39a984a30d2c054909c" Dec 03 19:56:40.729969 master-0 kubenswrapper[9368]: E1203 19:56:40.729934 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6cdf87e1cdb392734b594d781cade9c286b5f000271d39a984a30d2c054909c\": container with ID starting with b6cdf87e1cdb392734b594d781cade9c286b5f000271d39a984a30d2c054909c not found: ID does not exist" containerID="b6cdf87e1cdb392734b594d781cade9c286b5f000271d39a984a30d2c054909c" Dec 03 19:56:40.730045 master-0 kubenswrapper[9368]: I1203 19:56:40.729989 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6cdf87e1cdb392734b594d781cade9c286b5f000271d39a984a30d2c054909c"} err="failed to get container status \"b6cdf87e1cdb392734b594d781cade9c286b5f000271d39a984a30d2c054909c\": rpc error: code = NotFound desc = could not find container \"b6cdf87e1cdb392734b594d781cade9c286b5f000271d39a984a30d2c054909c\": container with ID starting with b6cdf87e1cdb392734b594d781cade9c286b5f000271d39a984a30d2c054909c not found: ID does not exist" Dec 03 19:56:40.764237 master-0 kubenswrapper[9368]: I1203 19:56:40.764198 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_33ba32c7-9e77-419f-b417-8f1aa28ecd5d/installer/0.log" Dec 03 19:56:40.764481 master-0 kubenswrapper[9368]: I1203 19:56:40.764269 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Dec 03 19:56:40.787457 master-0 kubenswrapper[9368]: I1203 19:56:40.787408 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_a454ceb3-7801-4bf0-82d3-112ab93b687b/installer/0.log" Dec 03 19:56:40.788265 master-0 kubenswrapper[9368]: I1203 19:56:40.787485 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Dec 03 19:56:40.884359 master-0 kubenswrapper[9368]: I1203 19:56:40.883495 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33ba32c7-9e77-419f-b417-8f1aa28ecd5d-kubelet-dir\") pod \"33ba32c7-9e77-419f-b417-8f1aa28ecd5d\" (UID: \"33ba32c7-9e77-419f-b417-8f1aa28ecd5d\") " Dec 03 19:56:40.884359 master-0 kubenswrapper[9368]: I1203 19:56:40.883580 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/33ba32c7-9e77-419f-b417-8f1aa28ecd5d-var-lock\") pod \"33ba32c7-9e77-419f-b417-8f1aa28ecd5d\" (UID: \"33ba32c7-9e77-419f-b417-8f1aa28ecd5d\") " Dec 03 19:56:40.884359 master-0 kubenswrapper[9368]: I1203 19:56:40.883658 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33ba32c7-9e77-419f-b417-8f1aa28ecd5d-kube-api-access\") pod \"33ba32c7-9e77-419f-b417-8f1aa28ecd5d\" (UID: \"33ba32c7-9e77-419f-b417-8f1aa28ecd5d\") " Dec 03 19:56:40.884359 master-0 kubenswrapper[9368]: I1203 19:56:40.884281 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ba32c7-9e77-419f-b417-8f1aa28ecd5d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "33ba32c7-9e77-419f-b417-8f1aa28ecd5d" (UID: "33ba32c7-9e77-419f-b417-8f1aa28ecd5d"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:56:40.884359 master-0 kubenswrapper[9368]: I1203 19:56:40.884313 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ba32c7-9e77-419f-b417-8f1aa28ecd5d-var-lock" (OuterVolumeSpecName: "var-lock") pod "33ba32c7-9e77-419f-b417-8f1aa28ecd5d" (UID: "33ba32c7-9e77-419f-b417-8f1aa28ecd5d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:56:40.888461 master-0 kubenswrapper[9368]: I1203 19:56:40.887004 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ba32c7-9e77-419f-b417-8f1aa28ecd5d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "33ba32c7-9e77-419f-b417-8f1aa28ecd5d" (UID: "33ba32c7-9e77-419f-b417-8f1aa28ecd5d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:56:40.984685 master-0 kubenswrapper[9368]: I1203 19:56:40.984631 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a454ceb3-7801-4bf0-82d3-112ab93b687b-var-lock\") pod \"a454ceb3-7801-4bf0-82d3-112ab93b687b\" (UID: \"a454ceb3-7801-4bf0-82d3-112ab93b687b\") " Dec 03 19:56:40.984895 master-0 kubenswrapper[9368]: I1203 19:56:40.984711 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a454ceb3-7801-4bf0-82d3-112ab93b687b-kube-api-access\") pod \"a454ceb3-7801-4bf0-82d3-112ab93b687b\" (UID: \"a454ceb3-7801-4bf0-82d3-112ab93b687b\") " Dec 03 19:56:40.984895 master-0 kubenswrapper[9368]: I1203 19:56:40.984733 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a454ceb3-7801-4bf0-82d3-112ab93b687b-var-lock" (OuterVolumeSpecName: "var-lock") pod 
"a454ceb3-7801-4bf0-82d3-112ab93b687b" (UID: "a454ceb3-7801-4bf0-82d3-112ab93b687b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:56:40.984895 master-0 kubenswrapper[9368]: I1203 19:56:40.984765 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a454ceb3-7801-4bf0-82d3-112ab93b687b-kubelet-dir\") pod \"a454ceb3-7801-4bf0-82d3-112ab93b687b\" (UID: \"a454ceb3-7801-4bf0-82d3-112ab93b687b\") " Dec 03 19:56:40.984895 master-0 kubenswrapper[9368]: I1203 19:56:40.984865 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a454ceb3-7801-4bf0-82d3-112ab93b687b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a454ceb3-7801-4bf0-82d3-112ab93b687b" (UID: "a454ceb3-7801-4bf0-82d3-112ab93b687b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:56:40.985044 master-0 kubenswrapper[9368]: I1203 19:56:40.985008 9368 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33ba32c7-9e77-419f-b417-8f1aa28ecd5d-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 19:56:40.985044 master-0 kubenswrapper[9368]: I1203 19:56:40.985044 9368 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a454ceb3-7801-4bf0-82d3-112ab93b687b-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 19:56:40.985127 master-0 kubenswrapper[9368]: I1203 19:56:40.985057 9368 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/33ba32c7-9e77-419f-b417-8f1aa28ecd5d-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 19:56:40.985127 master-0 kubenswrapper[9368]: I1203 19:56:40.985066 9368 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/a454ceb3-7801-4bf0-82d3-112ab93b687b-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 19:56:40.985127 master-0 kubenswrapper[9368]: I1203 19:56:40.985075 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33ba32c7-9e77-419f-b417-8f1aa28ecd5d-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 19:56:40.987583 master-0 kubenswrapper[9368]: I1203 19:56:40.987540 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a454ceb3-7801-4bf0-82d3-112ab93b687b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a454ceb3-7801-4bf0-82d3-112ab93b687b" (UID: "a454ceb3-7801-4bf0-82d3-112ab93b687b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:56:41.086189 master-0 kubenswrapper[9368]: I1203 19:56:41.085909 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a454ceb3-7801-4bf0-82d3-112ab93b687b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 19:56:41.413683 master-0 kubenswrapper[9368]: I1203 19:56:41.413638 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 19:56:41.574057 master-0 kubenswrapper[9368]: I1203 19:56:41.573992 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_a454ceb3-7801-4bf0-82d3-112ab93b687b/installer/0.log" Dec 03 19:56:41.574260 master-0 kubenswrapper[9368]: I1203 19:56:41.574094 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"a454ceb3-7801-4bf0-82d3-112ab93b687b","Type":"ContainerDied","Data":"22fb555625ad569a949864018978374315453d83ca6adc979cc16aef974813c6"} Dec 03 19:56:41.574260 master-0 
kubenswrapper[9368]: I1203 19:56:41.574136 9368 scope.go:117] "RemoveContainer" containerID="53fcd11329bfb04c78021321046cdb8bf43022a895b0f92bf8ee67416f18fc5f" Dec 03 19:56:41.574260 master-0 kubenswrapper[9368]: I1203 19:56:41.574165 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Dec 03 19:56:41.576068 master-0 kubenswrapper[9368]: I1203 19:56:41.576032 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_33ba32c7-9e77-419f-b417-8f1aa28ecd5d/installer/0.log" Dec 03 19:56:41.576240 master-0 kubenswrapper[9368]: I1203 19:56:41.576207 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"33ba32c7-9e77-419f-b417-8f1aa28ecd5d","Type":"ContainerDied","Data":"764602bbdcf44f32b31fcf6b225c7cc11878e8ab080298308eed45bd7554c4ed"} Dec 03 19:56:41.576240 master-0 kubenswrapper[9368]: I1203 19:56:41.576212 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Dec 03 19:56:43.595918 master-0 kubenswrapper[9368]: I1203 19:56:43.595876 9368 generic.go:334] "Generic (PLEG): container finished" podID="943feb0d-7d31-446a-9100-dfc4ef013d12" containerID="23b4f3f34e8595251e0fdeffba36a81024e5f343e733b49e23a5e472d12bfa81" exitCode=0 Dec 03 19:56:43.595918 master-0 kubenswrapper[9368]: I1203 19:56:43.595921 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" event={"ID":"943feb0d-7d31-446a-9100-dfc4ef013d12","Type":"ContainerDied","Data":"23b4f3f34e8595251e0fdeffba36a81024e5f343e733b49e23a5e472d12bfa81"} Dec 03 19:56:43.596412 master-0 kubenswrapper[9368]: I1203 19:56:43.596330 9368 scope.go:117] "RemoveContainer" containerID="23b4f3f34e8595251e0fdeffba36a81024e5f343e733b49e23a5e472d12bfa81" Dec 03 19:56:43.652919 master-0 kubenswrapper[9368]: I1203 19:56:43.652876 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-ff788744d-hkt6c"] Dec 03 19:56:43.653169 master-0 kubenswrapper[9368]: E1203 19:56:43.653069 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a454ceb3-7801-4bf0-82d3-112ab93b687b" containerName="installer" Dec 03 19:56:43.653169 master-0 kubenswrapper[9368]: I1203 19:56:43.653079 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="a454ceb3-7801-4bf0-82d3-112ab93b687b" containerName="installer" Dec 03 19:56:43.653169 master-0 kubenswrapper[9368]: E1203 19:56:43.653094 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e727b97c-263b-430a-8502-106236863710" containerName="controller-manager" Dec 03 19:56:43.653169 master-0 kubenswrapper[9368]: I1203 19:56:43.653100 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="e727b97c-263b-430a-8502-106236863710" containerName="controller-manager" Dec 03 19:56:43.653169 master-0 kubenswrapper[9368]: E1203 19:56:43.653119 9368 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a5a792-4066-4863-a409-aaeb1b6ac193" containerName="installer" Dec 03 19:56:43.653169 master-0 kubenswrapper[9368]: I1203 19:56:43.653126 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a5a792-4066-4863-a409-aaeb1b6ac193" containerName="installer" Dec 03 19:56:43.653169 master-0 kubenswrapper[9368]: E1203 19:56:43.653134 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ba32c7-9e77-419f-b417-8f1aa28ecd5d" containerName="installer" Dec 03 19:56:43.653169 master-0 kubenswrapper[9368]: I1203 19:56:43.653139 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ba32c7-9e77-419f-b417-8f1aa28ecd5d" containerName="installer" Dec 03 19:56:43.653568 master-0 kubenswrapper[9368]: I1203 19:56:43.653233 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="93a5a792-4066-4863-a409-aaeb1b6ac193" containerName="installer" Dec 03 19:56:43.653568 master-0 kubenswrapper[9368]: I1203 19:56:43.653242 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ba32c7-9e77-419f-b417-8f1aa28ecd5d" containerName="installer" Dec 03 19:56:43.653568 master-0 kubenswrapper[9368]: I1203 19:56:43.653251 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="a454ceb3-7801-4bf0-82d3-112ab93b687b" containerName="installer" Dec 03 19:56:43.653568 master-0 kubenswrapper[9368]: I1203 19:56:43.653262 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="e727b97c-263b-430a-8502-106236863710" containerName="controller-manager" Dec 03 19:56:43.653832 master-0 kubenswrapper[9368]: I1203 19:56:43.653582 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:56:43.655039 master-0 kubenswrapper[9368]: I1203 19:56:43.655013 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 19:56:43.655735 master-0 kubenswrapper[9368]: I1203 19:56:43.655715 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 19:56:43.656678 master-0 kubenswrapper[9368]: I1203 19:56:43.656533 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 19:56:43.656678 master-0 kubenswrapper[9368]: I1203 19:56:43.656544 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 19:56:43.657433 master-0 kubenswrapper[9368]: I1203 19:56:43.657413 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 19:56:43.664147 master-0 kubenswrapper[9368]: I1203 19:56:43.664105 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 19:56:43.720603 master-0 kubenswrapper[9368]: I1203 19:56:43.720545 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c22cb59-5083-4be6-9998-a9e67a2c20cd-serving-cert\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:56:43.720603 master-0 kubenswrapper[9368]: I1203 19:56:43.720596 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-proxy-ca-bundles\") pod 
\"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:56:43.720921 master-0 kubenswrapper[9368]: I1203 19:56:43.720630 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-config\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:56:43.720921 master-0 kubenswrapper[9368]: I1203 19:56:43.720706 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cnmn\" (UniqueName: \"kubernetes.io/projected/1c22cb59-5083-4be6-9998-a9e67a2c20cd-kube-api-access-7cnmn\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:56:43.720921 master-0 kubenswrapper[9368]: I1203 19:56:43.720765 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-client-ca\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:56:43.821581 master-0 kubenswrapper[9368]: I1203 19:56:43.821525 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c22cb59-5083-4be6-9998-a9e67a2c20cd-serving-cert\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:56:43.821581 master-0 kubenswrapper[9368]: I1203 19:56:43.821583 
9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-proxy-ca-bundles\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:56:43.821862 master-0 kubenswrapper[9368]: I1203 19:56:43.821631 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-config\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:56:43.821862 master-0 kubenswrapper[9368]: I1203 19:56:43.821661 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cnmn\" (UniqueName: \"kubernetes.io/projected/1c22cb59-5083-4be6-9998-a9e67a2c20cd-kube-api-access-7cnmn\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:56:43.821862 master-0 kubenswrapper[9368]: I1203 19:56:43.821725 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-client-ca\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:56:43.824332 master-0 kubenswrapper[9368]: I1203 19:56:43.824291 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-client-ca\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " 
pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:56:43.824413 master-0 kubenswrapper[9368]: I1203 19:56:43.824368 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-proxy-ca-bundles\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:56:43.824467 master-0 kubenswrapper[9368]: I1203 19:56:43.824418 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-config\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:56:43.825363 master-0 kubenswrapper[9368]: I1203 19:56:43.825325 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c22cb59-5083-4be6-9998-a9e67a2c20cd-serving-cert\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:56:44.826304 master-0 kubenswrapper[9368]: I1203 19:56:44.825479 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-ff788744d-hkt6c"] Dec 03 19:56:47.637064 master-0 kubenswrapper[9368]: I1203 19:56:47.636877 9368 generic.go:334] "Generic (PLEG): container finished" podID="01d51d9a-9beb-4357-9dc2-aeac210cd0c4" containerID="e73e12ce13ca81b680321fa012f494204d85d5e6386ba40c3313c0c4756967da" exitCode=0 Dec 03 19:56:47.637064 master-0 kubenswrapper[9368]: I1203 19:56:47.636971 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" 
event={"ID":"01d51d9a-9beb-4357-9dc2-aeac210cd0c4","Type":"ContainerDied","Data":"e73e12ce13ca81b680321fa012f494204d85d5e6386ba40c3313c0c4756967da"} Dec 03 19:56:47.638028 master-0 kubenswrapper[9368]: I1203 19:56:47.637827 9368 scope.go:117] "RemoveContainer" containerID="e73e12ce13ca81b680321fa012f494204d85d5e6386ba40c3313c0c4756967da" Dec 03 19:56:48.645256 master-0 kubenswrapper[9368]: I1203 19:56:48.645191 9368 generic.go:334] "Generic (PLEG): container finished" podID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" containerID="8ee6a0b56a85c0d14ad54d2283fc55b5a9f7a55c73d41cd24b0430be03f47449" exitCode=0 Dec 03 19:56:48.645814 master-0 kubenswrapper[9368]: I1203 19:56:48.645272 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" event={"ID":"a185ee17-4b4b-4d20-a8ed-56a2a01f1807","Type":"ContainerDied","Data":"8ee6a0b56a85c0d14ad54d2283fc55b5a9f7a55c73d41cd24b0430be03f47449"} Dec 03 19:56:48.645814 master-0 kubenswrapper[9368]: I1203 19:56:48.645790 9368 scope.go:117] "RemoveContainer" containerID="8ee6a0b56a85c0d14ad54d2283fc55b5a9f7a55c73d41cd24b0430be03f47449" Dec 03 19:56:49.600502 master-0 kubenswrapper[9368]: I1203 19:56:49.600454 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cnmn\" (UniqueName: \"kubernetes.io/projected/1c22cb59-5083-4be6-9998-a9e67a2c20cd-kube-api-access-7cnmn\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:56:49.669526 master-0 kubenswrapper[9368]: I1203 19:56:49.669457 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:56:52.463014 master-0 kubenswrapper[9368]: I1203 19:56:52.462886 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 19:56:53.582649 master-0 kubenswrapper[9368]: I1203 19:56:53.582591 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Dec 03 19:56:53.611830 master-0 kubenswrapper[9368]: I1203 19:56:53.605392 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Dec 03 19:56:53.690228 master-0 kubenswrapper[9368]: I1203 19:56:53.690176 9368 generic.go:334] "Generic (PLEG): container finished" podID="11e2c94f-f9e9-415b-a550-3006a4632ba4" containerID="74b33948f209172661a41eab8dd989534e03391e2f9b3dab897af1dbb663716c" exitCode=0 Dec 03 19:56:53.690457 master-0 kubenswrapper[9368]: I1203 19:56:53.690237 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" event={"ID":"11e2c94f-f9e9-415b-a550-3006a4632ba4","Type":"ContainerDied","Data":"74b33948f209172661a41eab8dd989534e03391e2f9b3dab897af1dbb663716c"} Dec 03 19:56:53.690864 master-0 kubenswrapper[9368]: I1203 19:56:53.690837 9368 scope.go:117] "RemoveContainer" containerID="74b33948f209172661a41eab8dd989534e03391e2f9b3dab897af1dbb663716c" Dec 03 19:56:54.780690 master-0 kubenswrapper[9368]: I1203 19:56:54.779826 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a454ceb3-7801-4bf0-82d3-112ab93b687b" path="/var/lib/kubelet/pods/a454ceb3-7801-4bf0-82d3-112ab93b687b/volumes" Dec 03 19:56:55.786432 master-0 kubenswrapper[9368]: I1203 19:56:55.786261 9368 generic.go:334] "Generic (PLEG): container finished" 
podID="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f" containerID="e25a90c6c614930a0aba8ebec6ee17a1bf73a834467d4ec954b7d5ad039662fb" exitCode=0 Dec 03 19:56:55.786432 master-0 kubenswrapper[9368]: I1203 19:56:55.786309 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" event={"ID":"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f","Type":"ContainerDied","Data":"e25a90c6c614930a0aba8ebec6ee17a1bf73a834467d4ec954b7d5ad039662fb"} Dec 03 19:56:55.787350 master-0 kubenswrapper[9368]: I1203 19:56:55.786721 9368 scope.go:117] "RemoveContainer" containerID="e25a90c6c614930a0aba8ebec6ee17a1bf73a834467d4ec954b7d5ad039662fb" Dec 03 19:56:56.939643 master-0 kubenswrapper[9368]: I1203 19:56:56.939573 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-7t8bs"] Dec 03 19:56:56.943803 master-0 kubenswrapper[9368]: I1203 19:56:56.940896 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 19:56:56.943803 master-0 kubenswrapper[9368]: I1203 19:56:56.943077 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz"] Dec 03 19:56:56.948822 master-0 kubenswrapper[9368]: I1203 19:56:56.944432 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 19:56:57.006861 master-0 kubenswrapper[9368]: I1203 19:56:57.006809 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-proxy-tls\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 19:56:57.007051 master-0 kubenswrapper[9368]: I1203 19:56:57.006899 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-rootfs\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 19:56:57.007051 master-0 kubenswrapper[9368]: I1203 19:56:57.006942 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5dpx\" (UniqueName: \"kubernetes.io/projected/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-kube-api-access-c5dpx\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 19:56:57.007051 master-0 kubenswrapper[9368]: I1203 19:56:57.007024 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-mcd-auth-proxy-config\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 19:56:57.108161 master-0 kubenswrapper[9368]: I1203 19:56:57.108073 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-rootfs\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 19:56:57.108161 master-0 kubenswrapper[9368]: I1203 19:56:57.108137 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-rootfs\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 19:56:57.108452 master-0 kubenswrapper[9368]: I1203 19:56:57.108179 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5dpx\" (UniqueName: \"kubernetes.io/projected/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-kube-api-access-c5dpx\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 19:56:57.108452 master-0 kubenswrapper[9368]: I1203 19:56:57.108224 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-mcd-auth-proxy-config\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 19:56:57.108452 master-0 
kubenswrapper[9368]: I1203 19:56:57.108261 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-proxy-tls\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 19:56:57.109212 master-0 kubenswrapper[9368]: I1203 19:56:57.109177 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-mcd-auth-proxy-config\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 19:56:57.112499 master-0 kubenswrapper[9368]: I1203 19:56:57.112462 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-proxy-tls\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 19:56:57.124053 master-0 kubenswrapper[9368]: I1203 19:56:57.124023 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5dpx\" (UniqueName: \"kubernetes.io/projected/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-kube-api-access-c5dpx\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 19:56:57.163862 master-0 kubenswrapper[9368]: I1203 19:56:57.162390 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/installer-1-master-0"] Dec 03 19:56:57.164749 master-0 kubenswrapper[9368]: I1203 19:56:57.164701 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/installer-1-master-0"] Dec 03 
19:56:57.336457 master-0 kubenswrapper[9368]: I1203 19:56:57.336323 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 19:56:58.556845 master-0 kubenswrapper[9368]: I1203 19:56:58.556662 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ba32c7-9e77-419f-b417-8f1aa28ecd5d" path="/var/lib/kubelet/pods/33ba32c7-9e77-419f-b417-8f1aa28ecd5d/volumes" Dec 03 19:57:00.823222 master-0 kubenswrapper[9368]: I1203 19:57:00.823119 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/0.log" Dec 03 19:57:00.823222 master-0 kubenswrapper[9368]: I1203 19:57:00.823187 9368 generic.go:334] "Generic (PLEG): container finished" podID="daa8efc0-4514-4a14-80f5-ab9eca53a127" containerID="794beba2362386c338599c102e787bfbcb667a8f297d93f341ccc297bdb73087" exitCode=1 Dec 03 19:57:00.823222 master-0 kubenswrapper[9368]: I1203 19:57:00.823220 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" event={"ID":"daa8efc0-4514-4a14-80f5-ab9eca53a127","Type":"ContainerDied","Data":"794beba2362386c338599c102e787bfbcb667a8f297d93f341ccc297bdb73087"} Dec 03 19:57:00.824421 master-0 kubenswrapper[9368]: I1203 19:57:00.823684 9368 scope.go:117] "RemoveContainer" containerID="794beba2362386c338599c102e787bfbcb667a8f297d93f341ccc297bdb73087" Dec 03 19:57:01.835148 master-0 kubenswrapper[9368]: I1203 19:57:01.835075 9368 generic.go:334] "Generic (PLEG): container finished" podID="5b3ee9a2-0f17-4a04-9191-b60684ef6c29" containerID="67df0016b48dcce14201ac3044aca405e44a73dd4f2748c38de589d5302c6d89" exitCode=0 Dec 03 19:57:01.835148 master-0 kubenswrapper[9368]: I1203 19:57:01.835128 9368 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" event={"ID":"5b3ee9a2-0f17-4a04-9191-b60684ef6c29","Type":"ContainerDied","Data":"67df0016b48dcce14201ac3044aca405e44a73dd4f2748c38de589d5302c6d89"} Dec 03 19:57:01.835671 master-0 kubenswrapper[9368]: I1203 19:57:01.835633 9368 scope.go:117] "RemoveContainer" containerID="67df0016b48dcce14201ac3044aca405e44a73dd4f2748c38de589d5302c6d89" Dec 03 19:57:05.656624 master-0 kubenswrapper[9368]: I1203 19:57:05.654404 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99"] Dec 03 19:57:06.893722 master-0 kubenswrapper[9368]: I1203 19:57:06.892840 9368 generic.go:334] "Generic (PLEG): container finished" podID="6eb4700c-6af0-468b-afc8-1e09b902d6bf" containerID="5b669ed74eaf8bfa020c73b3caed3c1731e9f130494d0a6716eecb9c6dd302d9" exitCode=0 Dec 03 19:57:06.893722 master-0 kubenswrapper[9368]: I1203 19:57:06.892934 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" event={"ID":"6eb4700c-6af0-468b-afc8-1e09b902d6bf","Type":"ContainerDied","Data":"5b669ed74eaf8bfa020c73b3caed3c1731e9f130494d0a6716eecb9c6dd302d9"} Dec 03 19:57:06.893722 master-0 kubenswrapper[9368]: I1203 19:57:06.893691 9368 scope.go:117] "RemoveContainer" containerID="5b669ed74eaf8bfa020c73b3caed3c1731e9f130494d0a6716eecb9c6dd302d9" Dec 03 19:57:10.079989 master-0 kubenswrapper[9368]: I1203 19:57:10.079901 9368 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0-master-0"] Dec 03 19:57:10.080925 master-0 kubenswrapper[9368]: I1203 19:57:10.080217 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="41b95a38663dd6fe34e183818a475977" containerName="etcdctl" 
containerID="cri-o://fc327643e61db9d9337a443f21096010694e550ffc71b3be3921aca847fdd4bd" gracePeriod=30 Dec 03 19:57:10.080925 master-0 kubenswrapper[9368]: I1203 19:57:10.080241 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="41b95a38663dd6fe34e183818a475977" containerName="etcd" containerID="cri-o://05747084f9e49c9f0d255ef42ef3e83cd2a8abb1990c562931e3ac0ccc06b877" gracePeriod=30 Dec 03 19:57:10.083439 master-0 kubenswrapper[9368]: I1203 19:57:10.083379 9368 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Dec 03 19:57:10.083769 master-0 kubenswrapper[9368]: E1203 19:57:10.083608 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b95a38663dd6fe34e183818a475977" containerName="etcdctl" Dec 03 19:57:10.083769 master-0 kubenswrapper[9368]: I1203 19:57:10.083624 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b95a38663dd6fe34e183818a475977" containerName="etcdctl" Dec 03 19:57:10.083769 master-0 kubenswrapper[9368]: E1203 19:57:10.083647 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b95a38663dd6fe34e183818a475977" containerName="etcd" Dec 03 19:57:10.083769 master-0 kubenswrapper[9368]: I1203 19:57:10.083656 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b95a38663dd6fe34e183818a475977" containerName="etcd" Dec 03 19:57:10.083769 master-0 kubenswrapper[9368]: I1203 19:57:10.083802 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b95a38663dd6fe34e183818a475977" containerName="etcdctl" Dec 03 19:57:10.083769 master-0 kubenswrapper[9368]: I1203 19:57:10.083831 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b95a38663dd6fe34e183818a475977" containerName="etcd" Dec 03 19:57:10.089469 master-0 kubenswrapper[9368]: I1203 19:57:10.089424 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.227371 master-0 kubenswrapper[9368]: I1203 19:57:10.227244 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-cert-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.227371 master-0 kubenswrapper[9368]: I1203 19:57:10.227348 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-usr-local-bin\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.227603 master-0 kubenswrapper[9368]: I1203 19:57:10.227541 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-static-pod-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.227649 master-0 kubenswrapper[9368]: I1203 19:57:10.227614 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-data-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.227904 master-0 kubenswrapper[9368]: I1203 19:57:10.227821 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-resource-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.228005 master-0 
kubenswrapper[9368]: I1203 19:57:10.227977 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-log-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.328909 master-0 kubenswrapper[9368]: I1203 19:57:10.328857 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-static-pod-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.329008 master-0 kubenswrapper[9368]: I1203 19:57:10.328920 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-data-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.329008 master-0 kubenswrapper[9368]: I1203 19:57:10.328975 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-resource-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.329074 master-0 kubenswrapper[9368]: I1203 19:57:10.329010 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-log-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.329108 master-0 kubenswrapper[9368]: I1203 19:57:10.329080 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-static-pod-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.329139 master-0 kubenswrapper[9368]: I1203 19:57:10.329110 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-cert-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.329200 master-0 kubenswrapper[9368]: I1203 19:57:10.329171 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-cert-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.329239 master-0 kubenswrapper[9368]: I1203 19:57:10.329196 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-usr-local-bin\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.329274 master-0 kubenswrapper[9368]: I1203 19:57:10.329220 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-data-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.329307 master-0 kubenswrapper[9368]: I1203 19:57:10.329234 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-log-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.329363 master-0 kubenswrapper[9368]: 
I1203 19:57:10.329289 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-resource-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.329423 master-0 kubenswrapper[9368]: I1203 19:57:10.329390 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-usr-local-bin\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 19:57:10.678241 master-0 kubenswrapper[9368]: I1203 19:57:10.678191 9368 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Dec 03 19:57:10.678520 master-0 kubenswrapper[9368]: I1203 19:57:10.678462 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="d78739a7694769882b7e47ea5ac08a10" containerName="kube-scheduler" containerID="cri-o://ca335c8e4de4141862b380dce4757695adee236b409b9c589070127007153500" gracePeriod=30 Dec 03 19:57:10.681413 master-0 kubenswrapper[9368]: I1203 19:57:10.681373 9368 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 03 19:57:10.681917 master-0 kubenswrapper[9368]: E1203 19:57:10.681883 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d78739a7694769882b7e47ea5ac08a10" containerName="kube-scheduler" Dec 03 19:57:10.682014 master-0 kubenswrapper[9368]: I1203 19:57:10.681916 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="d78739a7694769882b7e47ea5ac08a10" containerName="kube-scheduler" Dec 03 19:57:10.682146 master-0 kubenswrapper[9368]: I1203 19:57:10.682126 9368 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d78739a7694769882b7e47ea5ac08a10" containerName="kube-scheduler" Dec 03 19:57:10.684892 master-0 kubenswrapper[9368]: I1203 19:57:10.683986 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 19:57:10.837495 master-0 kubenswrapper[9368]: I1203 19:57:10.837410 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c46583dca69d50bb12bc004d7ee3300f-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"c46583dca69d50bb12bc004d7ee3300f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 19:57:10.837699 master-0 kubenswrapper[9368]: I1203 19:57:10.837637 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c46583dca69d50bb12bc004d7ee3300f-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"c46583dca69d50bb12bc004d7ee3300f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 19:57:10.926344 master-0 kubenswrapper[9368]: I1203 19:57:10.926176 9368 generic.go:334] "Generic (PLEG): container finished" podID="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3" containerID="5368f3d8c609d03f47b3a2379952daea482ac8f810b561b93821ae543a16d61e" exitCode=0 Dec 03 19:57:10.926344 master-0 kubenswrapper[9368]: I1203 19:57:10.926240 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" event={"ID":"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3","Type":"ContainerDied","Data":"5368f3d8c609d03f47b3a2379952daea482ac8f810b561b93821ae543a16d61e"} Dec 03 19:57:10.927035 master-0 kubenswrapper[9368]: I1203 19:57:10.926857 9368 scope.go:117] "RemoveContainer" containerID="5368f3d8c609d03f47b3a2379952daea482ac8f810b561b93821ae543a16d61e" Dec 03 
19:57:10.939164 master-0 kubenswrapper[9368]: I1203 19:57:10.939099 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c46583dca69d50bb12bc004d7ee3300f-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"c46583dca69d50bb12bc004d7ee3300f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Dec 03 19:57:10.939321 master-0 kubenswrapper[9368]: I1203 19:57:10.939240 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c46583dca69d50bb12bc004d7ee3300f-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"c46583dca69d50bb12bc004d7ee3300f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Dec 03 19:57:10.939321 master-0 kubenswrapper[9368]: I1203 19:57:10.939265 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c46583dca69d50bb12bc004d7ee3300f-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"c46583dca69d50bb12bc004d7ee3300f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Dec 03 19:57:10.939494 master-0 kubenswrapper[9368]: I1203 19:57:10.939457 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c46583dca69d50bb12bc004d7ee3300f-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"c46583dca69d50bb12bc004d7ee3300f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Dec 03 19:57:11.935339 master-0 kubenswrapper[9368]: I1203 19:57:11.935266 9368 generic.go:334] "Generic (PLEG): container finished" podID="bacd155a-fee3-4e5e-89a2-ab86f401d2ff" containerID="82116db57e57089f2a0aaaa865b4d91e3469d2022d11777a1eb493f1bba12223" exitCode=0
Dec 03 19:57:11.935339 master-0 kubenswrapper[9368]: I1203 19:57:11.935316 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"bacd155a-fee3-4e5e-89a2-ab86f401d2ff","Type":"ContainerDied","Data":"82116db57e57089f2a0aaaa865b4d91e3469d2022d11777a1eb493f1bba12223"}
Dec 03 19:57:16.973242 master-0 kubenswrapper[9368]: I1203 19:57:16.973156 9368 generic.go:334] "Generic (PLEG): container finished" podID="d78739a7694769882b7e47ea5ac08a10" containerID="ca335c8e4de4141862b380dce4757695adee236b409b9c589070127007153500" exitCode=0
Dec 03 19:57:21.050629 master-0 kubenswrapper[9368]: E1203 19:57:21.050565 9368 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 03 19:57:21.068066 master-0 kubenswrapper[9368]: I1203 19:57:21.068027 9368 scope.go:117] "RemoveContainer" containerID="31625955464d6bd81373d34abab51a1683cf14b5575194ba5d77e88634d8044d"
Dec 03 19:57:23.022613 master-0 kubenswrapper[9368]: I1203 19:57:23.022533 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"bacd155a-fee3-4e5e-89a2-ab86f401d2ff","Type":"ContainerDied","Data":"40375e8c9304a9008fd3f0ffbd7abeeba9e1599c1f09821321074397cef514ba"}
Dec 03 19:57:23.022613 master-0 kubenswrapper[9368]: I1203 19:57:23.022583 9368 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40375e8c9304a9008fd3f0ffbd7abeeba9e1599c1f09821321074397cef514ba"
Dec 03 19:57:23.027803 master-0 kubenswrapper[9368]: I1203 19:57:23.027750 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0"
Dec 03 19:57:23.143747 master-0 kubenswrapper[9368]: I1203 19:57:23.143646 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Dec 03 19:57:23.175382 master-0 kubenswrapper[9368]: W1203 19:57:23.175338 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9891cf64_59e8_4d8d_94fe_17cfa4b18c1b.slice/crio-7bfd19dcf77f81a2da47b10628f23027c2e3ee7dbe77cc6ea6e50ab79c6df0a9 WatchSource:0}: Error finding container 7bfd19dcf77f81a2da47b10628f23027c2e3ee7dbe77cc6ea6e50ab79c6df0a9: Status 404 returned error can't find the container with id 7bfd19dcf77f81a2da47b10628f23027c2e3ee7dbe77cc6ea6e50ab79c6df0a9
Dec 03 19:57:23.214918 master-0 kubenswrapper[9368]: I1203 19:57:23.214830 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bacd155a-fee3-4e5e-89a2-ab86f401d2ff-var-lock\") pod \"bacd155a-fee3-4e5e-89a2-ab86f401d2ff\" (UID: \"bacd155a-fee3-4e5e-89a2-ab86f401d2ff\") "
Dec 03 19:57:23.215006 master-0 kubenswrapper[9368]: I1203 19:57:23.214919 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bacd155a-fee3-4e5e-89a2-ab86f401d2ff-kube-api-access\") pod \"bacd155a-fee3-4e5e-89a2-ab86f401d2ff\" (UID: \"bacd155a-fee3-4e5e-89a2-ab86f401d2ff\") "
Dec 03 19:57:23.215006 master-0 kubenswrapper[9368]: I1203 19:57:23.214972 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bacd155a-fee3-4e5e-89a2-ab86f401d2ff-kubelet-dir\") pod \"bacd155a-fee3-4e5e-89a2-ab86f401d2ff\" (UID: \"bacd155a-fee3-4e5e-89a2-ab86f401d2ff\") "
Dec 03 19:57:23.215369 master-0 kubenswrapper[9368]: I1203 19:57:23.215235 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bacd155a-fee3-4e5e-89a2-ab86f401d2ff-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bacd155a-fee3-4e5e-89a2-ab86f401d2ff" (UID: "bacd155a-fee3-4e5e-89a2-ab86f401d2ff"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:57:23.215369 master-0 kubenswrapper[9368]: I1203 19:57:23.215285 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bacd155a-fee3-4e5e-89a2-ab86f401d2ff-var-lock" (OuterVolumeSpecName: "var-lock") pod "bacd155a-fee3-4e5e-89a2-ab86f401d2ff" (UID: "bacd155a-fee3-4e5e-89a2-ab86f401d2ff"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:57:23.215475 master-0 kubenswrapper[9368]: I1203 19:57:23.215433 9368 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bacd155a-fee3-4e5e-89a2-ab86f401d2ff-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Dec 03 19:57:23.215475 master-0 kubenswrapper[9368]: I1203 19:57:23.215456 9368 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bacd155a-fee3-4e5e-89a2-ab86f401d2ff-var-lock\") on node \"master-0\" DevicePath \"\""
Dec 03 19:57:23.223567 master-0 kubenswrapper[9368]: I1203 19:57:23.223518 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bacd155a-fee3-4e5e-89a2-ab86f401d2ff-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bacd155a-fee3-4e5e-89a2-ab86f401d2ff" (UID: "bacd155a-fee3-4e5e-89a2-ab86f401d2ff"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:57:23.316544 master-0 kubenswrapper[9368]: I1203 19:57:23.316189 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-secrets\") pod \"d78739a7694769882b7e47ea5ac08a10\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") "
Dec 03 19:57:23.316544 master-0 kubenswrapper[9368]: I1203 19:57:23.316240 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-logs\") pod \"d78739a7694769882b7e47ea5ac08a10\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") "
Dec 03 19:57:23.316544 master-0 kubenswrapper[9368]: I1203 19:57:23.316237 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-secrets" (OuterVolumeSpecName: "secrets") pod "d78739a7694769882b7e47ea5ac08a10" (UID: "d78739a7694769882b7e47ea5ac08a10"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:57:23.316544 master-0 kubenswrapper[9368]: I1203 19:57:23.316312 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-logs" (OuterVolumeSpecName: "logs") pod "d78739a7694769882b7e47ea5ac08a10" (UID: "d78739a7694769882b7e47ea5ac08a10"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:57:23.316544 master-0 kubenswrapper[9368]: I1203 19:57:23.316506 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bacd155a-fee3-4e5e-89a2-ab86f401d2ff-kube-api-access\") on node \"master-0\" DevicePath \"\""
Dec 03 19:57:23.316544 master-0 kubenswrapper[9368]: I1203 19:57:23.316518 9368 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-secrets\") on node \"master-0\" DevicePath \"\""
Dec 03 19:57:23.316544 master-0 kubenswrapper[9368]: I1203 19:57:23.316527 9368 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-logs\") on node \"master-0\" DevicePath \"\""
Dec 03 19:57:23.757801 master-0 kubenswrapper[9368]: E1203 19:57:23.754484 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Dec 03 19:57:23.757801 master-0 kubenswrapper[9368]: I1203 19:57:23.754976 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Dec 03 19:57:23.779104 master-0 kubenswrapper[9368]: W1203 19:57:23.777123 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc46583dca69d50bb12bc004d7ee3300f.slice/crio-d9703a47499dc38f6845f2d55184a1985a6a96f9f0e663c0707d6562d50b0c0c WatchSource:0}: Error finding container d9703a47499dc38f6845f2d55184a1985a6a96f9f0e663c0707d6562d50b0c0c: Status 404 returned error can't find the container with id d9703a47499dc38f6845f2d55184a1985a6a96f9f0e663c0707d6562d50b0c0c
Dec 03 19:57:24.036951 master-0 kubenswrapper[9368]: I1203 19:57:24.036887 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zrxk" event={"ID":"af6f6483-5ca1-48b7-90b5-b03d460d041a","Type":"ContainerStarted","Data":"ee56c77341ce8ae893027db16657f7afbcb30ba88b988af7846ab1aeb69a726e"}
Dec 03 19:57:24.052043 master-0 kubenswrapper[9368]: I1203 19:57:24.038527 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4" event={"ID":"b2021db5-b27a-4e06-beec-d9ba82aa1ffc","Type":"ContainerStarted","Data":"59c8bb29d939e6fee19d611a65e483cab2fd5e6e7ad2908b9de85424d9b9adfd"}
Dec 03 19:57:24.052043 master-0 kubenswrapper[9368]: I1203 19:57:24.039751 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" event={"ID":"6eb4700c-6af0-468b-afc8-1e09b902d6bf","Type":"ContainerStarted","Data":"441d867492f0f10ece1761a5339bcd749dc935547bbd2edddb84af4fe04b1249"}
Dec 03 19:57:24.052043 master-0 kubenswrapper[9368]: I1203 19:57:24.041442 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" event={"ID":"943feb0d-7d31-446a-9100-dfc4ef013d12","Type":"ContainerStarted","Data":"0acf9557821accd587e8bd9912ad989c059f24ef17109b73584eca0d899729a7"}
Dec 03 19:57:24.052043 master-0 kubenswrapper[9368]: I1203 19:57:24.049668 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" event={"ID":"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b","Type":"ContainerStarted","Data":"d4a5f6d7351d67e5e7a95502da59000a72543fff06341a57bb2e696e10e8a04c"}
Dec 03 19:57:24.052043 master-0 kubenswrapper[9368]: I1203 19:57:24.049698 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" event={"ID":"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b","Type":"ContainerStarted","Data":"550fa2508090ec9228e5344d14eb3903d47f1fd24e235f6122c95a9e089d9e56"}
Dec 03 19:57:24.052043 master-0 kubenswrapper[9368]: I1203 19:57:24.049712 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" event={"ID":"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b","Type":"ContainerStarted","Data":"7bfd19dcf77f81a2da47b10628f23027c2e3ee7dbe77cc6ea6e50ab79c6df0a9"}
Dec 03 19:57:24.052043 master-0 kubenswrapper[9368]: I1203 19:57:24.050712 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv" event={"ID":"6a82ff78-4383-4ca8-8a72-98c2ee50ffe2","Type":"ContainerStarted","Data":"27bb4679301fdc074dea2c84bbcbb0508130dbd7ecbd97ed590e3e11664e205b"}
Dec 03 19:57:24.052043 master-0 kubenswrapper[9368]: I1203 19:57:24.050731 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv" event={"ID":"6a82ff78-4383-4ca8-8a72-98c2ee50ffe2","Type":"ContainerStarted","Data":"f4ba1672df3b08fb7a73b6715298a794ef06dd7c6d31684f76b45134cb6f237e"}
Dec 03 19:57:24.053168 master-0 kubenswrapper[9368]: I1203 19:57:24.053133 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sp868" event={"ID":"48dfa48e-caea-4017-bd3e-d1da8bcd2da7","Type":"ContainerStarted","Data":"63e96daa282dbc7d024e787ccf340beb8400981b7e21cf7891ecde2dd88c97bf"}
Dec 03 19:57:24.054410 master-0 kubenswrapper[9368]: I1203 19:57:24.054374 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck" event={"ID":"b5cad72f-5bbf-42fc-9d63-545a01c98cbe","Type":"ContainerStarted","Data":"7d4b806f60c6e3a9fcd38f09aa10d060121f698a2f0e042f80f78d96aa5e5a4f"}
Dec 03 19:57:24.054498 master-0 kubenswrapper[9368]: I1203 19:57:24.054466 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck" podUID="b5cad72f-5bbf-42fc-9d63-545a01c98cbe" containerName="route-controller-manager" containerID="cri-o://7d4b806f60c6e3a9fcd38f09aa10d060121f698a2f0e042f80f78d96aa5e5a4f" gracePeriod=30
Dec 03 19:57:24.055997 master-0 kubenswrapper[9368]: I1203 19:57:24.055935 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"
Dec 03 19:57:24.062022 master-0 kubenswrapper[9368]: I1203 19:57:24.061705 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" event={"ID":"01d51d9a-9beb-4357-9dc2-aeac210cd0c4","Type":"ContainerStarted","Data":"16f863a99a7b4db6f75ba856ee48509b29d62e76913caec7ed378fa26c23b8d6"}
Dec 03 19:57:24.069581 master-0 kubenswrapper[9368]: I1203 19:57:24.069209 9368 patch_prober.go:28] interesting pod/route-controller-manager-869d689b5b-brqck container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.49:8443/healthz\": read tcp 10.128.0.2:57778->10.128.0.49:8443: read: connection reset by peer" start-of-body=
Dec 03 19:57:24.069581 master-0 kubenswrapper[9368]: I1203 19:57:24.069251 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck" podUID="b5cad72f-5bbf-42fc-9d63-545a01c98cbe" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.49:8443/healthz\": read tcp 10.128.0.2:57778->10.128.0.49:8443: read: connection reset by peer"
Dec 03 19:57:24.071902 master-0 kubenswrapper[9368]: I1203 19:57:24.070887 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" event={"ID":"433c3273-c99e-4d68-befc-06f92d2fc8d5","Type":"ContainerStarted","Data":"b83bb371542ad766112edb0a7c58da2aca27befbfdc6c60b17dd4c88076fd288"}
Dec 03 19:57:24.071902 master-0 kubenswrapper[9368]: I1203 19:57:24.070928 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" event={"ID":"433c3273-c99e-4d68-befc-06f92d2fc8d5","Type":"ContainerStarted","Data":"95d0ca3a853fd9f93e01c67870d1d4d269549c7560c451b67830fa1b176c7eb8"}
Dec 03 19:57:24.074987 master-0 kubenswrapper[9368]: I1203 19:57:24.074964 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/0.log"
Dec 03 19:57:24.075052 master-0 kubenswrapper[9368]: I1203 19:57:24.075026 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" event={"ID":"daa8efc0-4514-4a14-80f5-ab9eca53a127","Type":"ContainerStarted","Data":"12ba33f367264d50b59a4676b1e61bc0a6d45703296fe265553724b4dbafb201"}
Dec 03 19:57:24.085446 master-0 kubenswrapper[9368]: I1203 19:57:24.082631 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"c46583dca69d50bb12bc004d7ee3300f","Type":"ContainerStarted","Data":"d9703a47499dc38f6845f2d55184a1985a6a96f9f0e663c0707d6562d50b0c0c"}
Dec 03 19:57:24.085446 master-0 kubenswrapper[9368]: I1203 19:57:24.084038 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" event={"ID":"61b16a8a-27a2-4a07-b5f9-10a5be2ec870","Type":"ContainerStarted","Data":"d5e5f345f4c7214304a5c25631f848938166f13ee76c5366965060641404f3cc"}
Dec 03 19:57:24.088486 master-0 kubenswrapper[9368]: I1203 19:57:24.088448 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" event={"ID":"ad22d8ed-2476-441b-aa3b-a7845606b0ac","Type":"ContainerStarted","Data":"706b516c7f98083b7764bc273d09e6371632af415b6e8e32988726987b991300"}
Dec 03 19:57:24.090026 master-0 kubenswrapper[9368]: I1203 19:57:24.089988 9368 scope.go:117] "RemoveContainer" containerID="ca335c8e4de4141862b380dce4757695adee236b409b9c589070127007153500"
Dec 03 19:57:24.090103 master-0 kubenswrapper[9368]: I1203 19:57:24.089990 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Dec 03 19:57:24.091939 master-0 kubenswrapper[9368]: I1203 19:57:24.091907 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" event={"ID":"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f","Type":"ContainerStarted","Data":"d547bb93c93c86e1c0269c4fb32a10d62340ebafe98b4ab6c6927fd1a6493839"}
Dec 03 19:57:24.093624 master-0 kubenswrapper[9368]: I1203 19:57:24.093591 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" event={"ID":"6404bbc7-8ca9-4f20-8ce7-40f855555160","Type":"ContainerStarted","Data":"7471504fff1e11adbafaccb5de758469728b8b3a7df116558a9b2a9d9184adc0"}
Dec 03 19:57:24.094877 master-0 kubenswrapper[9368]: I1203 19:57:24.094840 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg" event={"ID":"cd35fc5f-07ab-4c66-9b80-33a598d417ef","Type":"ContainerStarted","Data":"3c1e1c3b72ec61f25e5c5ba7126cb944019b4e626d9bd5bb47ea3f3394e1ce57"}
Dec 03 19:57:24.099803 master-0 kubenswrapper[9368]: I1203 19:57:24.096247 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" event={"ID":"61ca5373-413c-4824-ba19-13b99c3081e4","Type":"ContainerStarted","Data":"4084aa8b9bc23d75642df1ea978e90c69ad65ead87bc335cb6c061e4184c4f57"}
Dec 03 19:57:24.099803 master-0 kubenswrapper[9368]: I1203 19:57:24.096337 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" podUID="61ca5373-413c-4824-ba19-13b99c3081e4" containerName="kube-rbac-proxy" containerID="cri-o://3b0aba4add3cc1310a8315895b5136f2d6481203591c17752e5cefa8d38657ee" gracePeriod=30
Dec 03 19:57:24.099803 master-0 kubenswrapper[9368]: I1203 19:57:24.096395 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" podUID="61ca5373-413c-4824-ba19-13b99c3081e4" containerName="machine-approver-controller" containerID="cri-o://4084aa8b9bc23d75642df1ea978e90c69ad65ead87bc335cb6c061e4184c4f57" gracePeriod=30
Dec 03 19:57:24.099803 master-0 kubenswrapper[9368]: I1203 19:57:24.099262 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" event={"ID":"11e2c94f-f9e9-415b-a550-3006a4632ba4","Type":"ContainerStarted","Data":"49a13ebc694f26cd89010ddce04800eb4f4c986f75a07318bd04a364d89d8c75"}
Dec 03 19:57:24.103834 master-0 kubenswrapper[9368]: I1203 19:57:24.100501 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" event={"ID":"af2023e1-9c7a-40af-a6bf-fba31c3565b1","Type":"ContainerStarted","Data":"46ead743a71c6c2931e92ae425d4f75d1fb17286150d55d4a739c7296e0b2be0"}
Dec 03 19:57:24.103834 master-0 kubenswrapper[9368]: I1203 19:57:24.102358 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" event={"ID":"5b3ee9a2-0f17-4a04-9191-b60684ef6c29","Type":"ContainerStarted","Data":"654ce27fb70f480beba5ca8af4a5c2faaea9183cad789692159d1b32739ab7ee"}
Dec 03 19:57:24.104864 master-0 kubenswrapper[9368]: I1203 19:57:24.104357 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2c8x" event={"ID":"acb1d894-1bc0-478d-87fc-e9137291df70","Type":"ContainerStarted","Data":"31c4e29d2c72aa9bd8313b56ca01fa882e7efb110999dd587df1e8251c22b8fa"}
Dec 03 19:57:24.107803 master-0 kubenswrapper[9368]: I1203 19:57:24.105747 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" event={"ID":"a185ee17-4b4b-4d20-a8ed-56a2a01f1807","Type":"ContainerStarted","Data":"341b601d165e50b66bc27ce1e4916e7fa4ef52e1059015c14333dc841ef12229"}
Dec 03 19:57:24.107803 master-0 kubenswrapper[9368]: I1203 19:57:24.107036 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" event={"ID":"f749c7f2-1fd7-4078-a92d-0ae5523998ac","Type":"ContainerStarted","Data":"744faadce32102a4f51bf311e6ce0b868fa1346e51cfebbdff76ea1eb3693fe2"}
Dec 03 19:57:24.108538 master-0 kubenswrapper[9368]: I1203 19:57:24.108493 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" event={"ID":"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3","Type":"ContainerStarted","Data":"e8e3349c588e032c5b231176a3b6f6e0b37f7441a8fa80e01747aaded424ead5"}
Dec 03 19:57:24.112801 master-0 kubenswrapper[9368]: I1203 19:57:24.112356 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0"
Dec 03 19:57:24.115887 master-0 kubenswrapper[9368]: I1203 19:57:24.115830 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc8kx" event={"ID":"81839b26-cf66-4532-a646-ef4cd5d5e471","Type":"ContainerStarted","Data":"2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02"}
Dec 03 19:57:24.303824 master-0 kubenswrapper[9368]: I1203 19:57:24.303578 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz"
Dec 03 19:57:24.341062 master-0 kubenswrapper[9368]: I1203 19:57:24.341023 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/61ca5373-413c-4824-ba19-13b99c3081e4-machine-approver-tls\") pod \"61ca5373-413c-4824-ba19-13b99c3081e4\" (UID: \"61ca5373-413c-4824-ba19-13b99c3081e4\") "
Dec 03 19:57:24.341156 master-0 kubenswrapper[9368]: I1203 19:57:24.341103 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l2tt\" (UniqueName: \"kubernetes.io/projected/61ca5373-413c-4824-ba19-13b99c3081e4-kube-api-access-5l2tt\") pod \"61ca5373-413c-4824-ba19-13b99c3081e4\" (UID: \"61ca5373-413c-4824-ba19-13b99c3081e4\") "
Dec 03 19:57:24.341432 master-0 kubenswrapper[9368]: I1203 19:57:24.341404 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/61ca5373-413c-4824-ba19-13b99c3081e4-auth-proxy-config\") pod \"61ca5373-413c-4824-ba19-13b99c3081e4\" (UID: \"61ca5373-413c-4824-ba19-13b99c3081e4\") "
Dec 03 19:57:24.341480 master-0 kubenswrapper[9368]: I1203 19:57:24.341463 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61ca5373-413c-4824-ba19-13b99c3081e4-config\") pod \"61ca5373-413c-4824-ba19-13b99c3081e4\" (UID: \"61ca5373-413c-4824-ba19-13b99c3081e4\") "
Dec 03 19:57:24.345531 master-0 kubenswrapper[9368]: I1203 19:57:24.342520 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61ca5373-413c-4824-ba19-13b99c3081e4-config" (OuterVolumeSpecName: "config") pod "61ca5373-413c-4824-ba19-13b99c3081e4" (UID: "61ca5373-413c-4824-ba19-13b99c3081e4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 19:57:24.345531 master-0 kubenswrapper[9368]: I1203 19:57:24.342602 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61ca5373-413c-4824-ba19-13b99c3081e4-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "61ca5373-413c-4824-ba19-13b99c3081e4" (UID: "61ca5373-413c-4824-ba19-13b99c3081e4"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 19:57:24.345531 master-0 kubenswrapper[9368]: I1203 19:57:24.344712 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61ca5373-413c-4824-ba19-13b99c3081e4-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "61ca5373-413c-4824-ba19-13b99c3081e4" (UID: "61ca5373-413c-4824-ba19-13b99c3081e4"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:57:24.347358 master-0 kubenswrapper[9368]: I1203 19:57:24.347316 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61ca5373-413c-4824-ba19-13b99c3081e4-kube-api-access-5l2tt" (OuterVolumeSpecName: "kube-api-access-5l2tt") pod "61ca5373-413c-4824-ba19-13b99c3081e4" (UID: "61ca5373-413c-4824-ba19-13b99c3081e4"). InnerVolumeSpecName "kube-api-access-5l2tt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:57:24.347557 master-0 kubenswrapper[9368]: I1203 19:57:24.347526 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l2tt\" (UniqueName: \"kubernetes.io/projected/61ca5373-413c-4824-ba19-13b99c3081e4-kube-api-access-5l2tt\") on node \"master-0\" DevicePath \"\""
Dec 03 19:57:24.347557 master-0 kubenswrapper[9368]: I1203 19:57:24.347552 9368 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/61ca5373-413c-4824-ba19-13b99c3081e4-auth-proxy-config\") on node \"master-0\" DevicePath \"\""
Dec 03 19:57:24.347637 master-0 kubenswrapper[9368]: I1203 19:57:24.347566 9368 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61ca5373-413c-4824-ba19-13b99c3081e4-config\") on node \"master-0\" DevicePath \"\""
Dec 03 19:57:24.347637 master-0 kubenswrapper[9368]: I1203 19:57:24.347578 9368 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/61ca5373-413c-4824-ba19-13b99c3081e4-machine-approver-tls\") on node \"master-0\" DevicePath \"\""
Dec 03 19:57:24.526844 master-0 kubenswrapper[9368]: I1203 19:57:24.523484 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-869d689b5b-brqck_b5cad72f-5bbf-42fc-9d63-545a01c98cbe/route-controller-manager/0.log"
Dec 03 19:57:24.526844 master-0 kubenswrapper[9368]: I1203 19:57:24.523565 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"
Dec 03 19:57:24.554838 master-0 kubenswrapper[9368]: I1203 19:57:24.550261 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d78739a7694769882b7e47ea5ac08a10" path="/var/lib/kubelet/pods/d78739a7694769882b7e47ea5ac08a10/volumes"
Dec 03 19:57:24.554838 master-0 kubenswrapper[9368]: I1203 19:57:24.550539 9368 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID=""
Dec 03 19:57:24.554838 master-0 kubenswrapper[9368]: I1203 19:57:24.550619 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-client-ca\") pod \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\" (UID: \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\") "
Dec 03 19:57:24.554838 master-0 kubenswrapper[9368]: I1203 19:57:24.550650 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-serving-cert\") pod \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\" (UID: \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\") "
Dec 03 19:57:24.554838 master-0 kubenswrapper[9368]: I1203 19:57:24.550685 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v6cx\" (UniqueName: \"kubernetes.io/projected/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-kube-api-access-9v6cx\") pod \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\" (UID: \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\") "
Dec 03 19:57:24.554838 master-0 kubenswrapper[9368]: I1203 19:57:24.550724 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-config\") pod \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\" (UID: \"b5cad72f-5bbf-42fc-9d63-545a01c98cbe\") "
Dec 03 19:57:24.554838 master-0 kubenswrapper[9368]: I1203 19:57:24.551549 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-config" (OuterVolumeSpecName: "config") pod "b5cad72f-5bbf-42fc-9d63-545a01c98cbe" (UID: "b5cad72f-5bbf-42fc-9d63-545a01c98cbe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 19:57:24.554838 master-0 kubenswrapper[9368]: I1203 19:57:24.553138 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-client-ca" (OuterVolumeSpecName: "client-ca") pod "b5cad72f-5bbf-42fc-9d63-545a01c98cbe" (UID: "b5cad72f-5bbf-42fc-9d63-545a01c98cbe"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 19:57:24.554838 master-0 kubenswrapper[9368]: I1203 19:57:24.554105 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-kube-api-access-9v6cx" (OuterVolumeSpecName: "kube-api-access-9v6cx") pod "b5cad72f-5bbf-42fc-9d63-545a01c98cbe" (UID: "b5cad72f-5bbf-42fc-9d63-545a01c98cbe"). InnerVolumeSpecName "kube-api-access-9v6cx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:57:24.554838 master-0 kubenswrapper[9368]: I1203 19:57:24.554312 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b5cad72f-5bbf-42fc-9d63-545a01c98cbe" (UID: "b5cad72f-5bbf-42fc-9d63-545a01c98cbe"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:57:24.651888 master-0 kubenswrapper[9368]: I1203 19:57:24.651848 9368 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-client-ca\") on node \"master-0\" DevicePath \"\""
Dec 03 19:57:24.651888 master-0 kubenswrapper[9368]: I1203 19:57:24.651881 9368 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-serving-cert\") on node \"master-0\" DevicePath \"\""
Dec 03 19:57:24.651888 master-0 kubenswrapper[9368]: I1203 19:57:24.651892 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v6cx\" (UniqueName: \"kubernetes.io/projected/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-kube-api-access-9v6cx\") on node \"master-0\" DevicePath \"\""
Dec 03 19:57:24.652028 master-0 kubenswrapper[9368]: I1203 19:57:24.651904 9368 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5cad72f-5bbf-42fc-9d63-545a01c98cbe-config\") on node \"master-0\" DevicePath \"\""
Dec 03 19:57:25.119138 master-0 kubenswrapper[9368]: I1203 19:57:25.119071 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-869d689b5b-brqck_b5cad72f-5bbf-42fc-9d63-545a01c98cbe/route-controller-manager/0.log"
Dec 03 19:57:25.119138 master-0 kubenswrapper[9368]: I1203 19:57:25.119114 9368 generic.go:334] "Generic (PLEG): container finished" podID="b5cad72f-5bbf-42fc-9d63-545a01c98cbe" containerID="7d4b806f60c6e3a9fcd38f09aa10d060121f698a2f0e042f80f78d96aa5e5a4f" exitCode=255
Dec 03 19:57:25.119951 master-0 kubenswrapper[9368]: I1203 19:57:25.119193 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"
Dec 03 19:57:25.122823 master-0 kubenswrapper[9368]: I1203 19:57:25.122722 9368 generic.go:334] "Generic (PLEG): container finished" podID="61ca5373-413c-4824-ba19-13b99c3081e4" containerID="4084aa8b9bc23d75642df1ea978e90c69ad65ead87bc335cb6c061e4184c4f57" exitCode=0
Dec 03 19:57:25.122933 master-0 kubenswrapper[9368]: I1203 19:57:25.122834 9368 generic.go:334] "Generic (PLEG): container finished" podID="61ca5373-413c-4824-ba19-13b99c3081e4" containerID="3b0aba4add3cc1310a8315895b5136f2d6481203591c17752e5cefa8d38657ee" exitCode=0
Dec 03 19:57:25.122933 master-0 kubenswrapper[9368]: I1203 19:57:25.122879 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz"
Dec 03 19:57:25.124644 master-0 kubenswrapper[9368]: I1203 19:57:25.124597 9368 generic.go:334] "Generic (PLEG): container finished" podID="ce4afc7a-a338-4a2c-bada-22d4bac75d49" containerID="6734488c6ce6905e5e770b668e83066dd3b8267a0d3cf0d97567edcd50a10461" exitCode=0
Dec 03 19:57:25.126114 master-0 kubenswrapper[9368]: I1203 19:57:25.126081 9368 generic.go:334] "Generic (PLEG): container finished" podID="c46583dca69d50bb12bc004d7ee3300f" containerID="384902c9d5118b992b516df4665219d1bebf7324327cde78b939566df8720f4b" exitCode=0
Dec 03 19:57:25.128146 master-0 kubenswrapper[9368]: I1203 19:57:25.128094 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-76f56467d7-npd99_61b16a8a-27a2-4a07-b5f9-10a5be2ec870/kube-rbac-proxy/0.log"
Dec 03 19:57:25.128967 master-0 kubenswrapper[9368]: I1203 19:57:25.128912 9368 generic.go:334] "Generic (PLEG): container finished" podID="61b16a8a-27a2-4a07-b5f9-10a5be2ec870" containerID="915fbda281e49c1b3d5c238f4642cee7ff396ff14f35312879fbf5a135ba0426" exitCode=1
Dec 03 19:57:25.129201 master-0 kubenswrapper[9368]: I1203 19:57:25.129123 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" podUID="61b16a8a-27a2-4a07-b5f9-10a5be2ec870" containerName="cluster-cloud-controller-manager" containerID="cri-o://d5e5f345f4c7214304a5c25631f848938166f13ee76c5366965060641404f3cc" gracePeriod=30
Dec 03 19:57:25.130109 master-0 kubenswrapper[9368]: I1203 19:57:25.129396 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" podUID="61b16a8a-27a2-4a07-b5f9-10a5be2ec870" containerName="config-sync-controllers" containerID="cri-o://2eacfef43cc179b008d0d050644e4aa26e93edb95b342c88f74321432bd7fc00" gracePeriod=30
Dec 03 19:57:25.137449 master-0 kubenswrapper[9368]: I1203 19:57:25.137405 9368 generic.go:334] "Generic (PLEG): container finished" podID="7bce50c457ac1f4721bc81a570dd238a" containerID="7def324ef495c1e55c8e9233ccd93d3408c35454ff9a9bc3bac5d21a48173630" exitCode=1
Dec 03 19:57:26.146189 master-0 kubenswrapper[9368]: I1203 19:57:26.146121 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-76f56467d7-npd99_61b16a8a-27a2-4a07-b5f9-10a5be2ec870/kube-rbac-proxy/0.log"
Dec 03 19:57:26.147092 master-0 kubenswrapper[9368]: I1203 19:57:26.147016 9368 generic.go:334] "Generic (PLEG): container finished" podID="61b16a8a-27a2-4a07-b5f9-10a5be2ec870" containerID="2eacfef43cc179b008d0d050644e4aa26e93edb95b342c88f74321432bd7fc00" exitCode=0
Dec 03 19:57:30.126744 master-0 kubenswrapper[9368]: I1203 19:57:30.126663 9368 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6zrxk" podUID="af6f6483-5ca1-48b7-90b5-b03d460d041a" containerName="registry-server" probeResult="failure" output=<
Dec 03 19:57:30.126744 master-0 kubenswrapper[9368]: timeout: failed to connect service ":50051" within 1s
Dec 03 19:57:30.126744 master-0 kubenswrapper[9368]: >
Dec 03 19:57:31.051457 master-0 kubenswrapper[9368]: E1203 19:57:31.051339 9368 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 03 19:57:32.422451 master-0 kubenswrapper[9368]: I1203 19:57:32.422347 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Dec 03 19:57:32.845942 master-0 kubenswrapper[9368]: I1203 19:57:32.845759 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Dec 03 19:57:38.131075 master-0 kubenswrapper[9368]: E1203 19:57:38.130948 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Dec 03 19:57:38.250194 master-0 kubenswrapper[9368]: I1203 19:57:38.250084 9368 generic.go:334] "Generic (PLEG): container finished" podID="41b95a38663dd6fe34e183818a475977" containerID="05747084f9e49c9f0d255ef42ef3e83cd2a8abb1990c562931e3ac0ccc06b877" exitCode=0
Dec 03 19:57:40.775752 master-0 kubenswrapper[9368]: E1203 19:57:40.775495 9368 
kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T19:57:30Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T19:57:30Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T19:57:30Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T19:57:30Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a17e9d83aeb6de5f0851aaacd1a9ebddbc8a4ac3ece2e4af8670aa0c33b8fc9c\\\"],\\\"sizeBytes\\\":1631769045},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2efbb545a141552851226bea008b13d92cbb084339bcfd6923b38d23c382145e\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:e5c3ad640f9c0c84490a0e0da7a1850b7873867936a5b604c07a8075c3a710d0\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1610175307},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98ce2d349f8bc693d76d9a68097b758b987cf17ea3beb66bbd09d12fa78b4d0c\\\"],\\\"sizeBytes\\\":1232076476},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:446e5d504e70c7963ef7b0f090f3fcb19847ef48150299bf030847565d7a579b\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a01ee07f4838bab6cfa5a3d25d867557aa271725bfcd20a1e52d3cc63423c06b\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1204969293},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:31b0e25262b7daa1c7a43042f865ca936aa1a52776994642f88b9a12408d27da\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ae694324b195581f5
42841a64634b63bae3d63332705b3a27320d18fde2aebe8\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201363276},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ebe9f523f5c211a3a0f2570331dddcd5be15b12c1fecd9b8b121f881bfaad029\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a31af646ce5587c051459a88df413dc30be81e7f15399aa909e19effa7de772a\\\"],\\\"sizeBytes\\\":983731853},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\\\"],\\\"sizeBytes\\\":938321573},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e8f313372fe49afad871cc56225dcd4d31bed249abeab55fb288e1f854138fbf\\\"],\\\"sizeBytes\\\":870581225},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc72da7f7930eb09abf6f8dbe577bb537e3a2a59dc0e14a4319b42c0212218d1\\\"],\\\"sizeBytes\\\":857083855},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e5c0acdd03dc840d7345ae397feaf6147a32a8fef89a0ac2ddc8d14b068c9ff\\\"],\\\"sizeBytes\\\":767313881},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:184239929f74bb7c56c1cf5b94b5f91dd4013a87034fe04b9fa1027d2bb6c5a4\\\"],\\\"sizeBytes\\\":682385666},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0d866f93bed16cfebd8019ad6b89a4dd4abedfc20ee5d28d7edad045e7df0fda\\\"],\\\"sizeBytes\\\":677540255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b99ce0f31213291444482af4af36345dc93acdbe965868073e8232797b8a2f14\\\"],\\\"sizeBytes\\\":672854011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8
ae67989a7616936e67195b758abac6b5d3f0d59562c8\\\"],\\\"sizeBytes\\\":616123373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:da6f62afd2795d1b0af69532a5534c099bbb81d4e7abd2616b374db191552c51\\\"],\\\"sizeBytes\\\":583850203},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:51a4c20765f54b6a6b5513f97cf54bb99631c2abe860949293456886a74f87fe\\\"],\\\"sizeBytes\\\":576621883},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc9758be9f0f0a480fb5e119ecb1e1101ef807bdc765a155212a8188d79b9e60\\\"],\\\"sizeBytes\\\":552687886},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e8903affdf29401b9a86b9f58795c9f445f34194960c7b2734f30601c48cbdf\\\"],\\\"sizeBytes\\\":543241813},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c921698d30c8175da0c124f72748e93551d6903b0f34d26743b60cb12d25cb1\\\"],\\\"sizeBytes\\\":532668041},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ceaa4102b35e54be54e23c8ea73bb0dac4978cffb54105ad00b51393f47595da\\\"],\\\"sizeBytes\\\":532338751},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ca4933b9ba55069205ea53970128c4e8c4b46560ef721c8aaee00aaf736664b5\\\"],\\\"sizeBytes\\\":512852463},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:98c80d92a2ef8d44ee625b229b77b7bfdb1b06cbfe0d4df9e2ca2cba904467f7\\\"],\\\"sizeBytes\\\":512468025},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\\\"],\\\"sizeBytes\\\":509451797},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a2ef63f356c11ba629d8038474ab287797340de1219b4fee97c386975689110\\\"],\\\"sizeBytes\\\":507701628},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:84a52132860e74998981b76c08d38543561197c3da77836c670fa8e394c5ec17\\\"],\\\"sizeBytes\\\":506755373},{\\\"names\\\":[\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:492103a8365ef9a1d5f237b4ba90aff87369167ec91db29ff0251ba5aab2b419\\\"],\\\"sizeBytes\\\":505663073},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2b518cb834a0b6ca50d73eceb5f8e64aefb09094d39e4ba0d8e4632f6cdf908\\\"],\\\"sizeBytes\\\":505642108},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:58ed827ee19ac91b6f860d307797b24b8aec02e671605388c4afe4fa19ddfc36\\\"],\\\"sizeBytes\\\":503354646},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eefdc67602b8bc3941001b030ab95d82e10432f814634b80eb8ce45bc9ebd3de\\\"],\\\"sizeBytes\\\":503025552},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3051af3343018fecbf3a6edacea69de841fc5211c09e7fb6a2499188dc979395\\\"],\\\"sizeBytes\\\":502450335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4cb6ecfb89e53653b69ae494ebc940b9fcf7b7db317b156e186435cc541589d9\\\"],\\\"sizeBytes\\\":500957387},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3d835ce07d1bec4a4b13f0bca5ea20ea5c781ea7853d7b42310f4ad8aeba6d7c\\\"],\\\"sizeBytes\\\":500863090},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49a6a3308d885301c7718a465f1af2d08a617abbdff23352d5422d1ae4af33cf\\\"],\\\"sizeBytes\\\":499812475},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e254a7fb8a2643817718cfdb54bc819e86eb84232f6e2456548c55c5efb09d2\\\"],\\\"sizeBytes\\\":499719811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:93145fd0c004dc4fca21435a32c7e55e962f321aff260d702f387cfdebee92a5\\\"],\\\"sizeBytes\\\":499096673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0c6de747539dd00ede882fb4f73cead462bf0a7efda7173fd5d443ef7a00251\\\"],\\\"sizeBytes\\\":490470354},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6199be91b821875ba2609cf7fa886b74b9a8b573622fe3
3cc1bc39cd55acac08\\\"],\\\"sizeBytes\\\":489542560},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b4e0b20fdb38d516e871ff5d593c4273cc9933cb6a65ec93e727ca4a7777fd20\\\"],\\\"sizeBytes\\\":478931717},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a3e2790bda8898df5e4e9cf1878103ac483ea1633819d76ea68976b0b2062b6\\\"],\\\"sizeBytes\\\":478655954},{\\\"names\\\":[],\\\"sizeBytes\\\":465302163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23aa409d98c18a25b5dd3c14b4c5a88eba2c793d020f2deb3bafd58a2225c328\\\"],\\\"sizeBytes\\\":465158513},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:656fe650bac2929182cd0cf7d7e566d089f69e06541b8329c6d40b89346c03ca\\\"],\\\"sizeBytes\\\":462741734},{\\\"names\\\":[],\\\"sizeBytes\\\":461716546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b03d2897e7cc0e8d0c306acb68ca3d9396d502882c14942faadfdb16bc40e17d\\\"],\\\"sizeBytes\\\":459566623},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:607e31ebb2c85f53775455b38a607a68cb2bdab1e369f03c57e715a4ebb88831\\\"],\\\"sizeBytes\\\":458183681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36fa1378b9c26de6d45187b1e7352f3b1147109427fab3669b107d81fd967601\\\"],\\\"sizeBytes\\\":452603646},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eac937aae64688cb47b38ad2cbba5aa7e6d41c691df1f3ca4ff81e5117084d1e\\\"],\\\"sizeBytes\\\":451053419},{\\\"names\\\":[],\\\"sizeBytes\\\":450855746},{\\\"names\\\":[],\\\"sizeBytes\\\":449985691}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 19:57:41.052613 master-0 kubenswrapper[9368]: E1203 19:57:41.052481 9368 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 19:57:41.278881 master-0 kubenswrapper[9368]: I1203 19:57:41.278696 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_41b95a38663dd6fe34e183818a475977/etcdctl/0.log" Dec 03 19:57:41.278881 master-0 kubenswrapper[9368]: I1203 19:57:41.278747 9368 generic.go:334] "Generic (PLEG): container finished" podID="41b95a38663dd6fe34e183818a475977" containerID="fc327643e61db9d9337a443f21096010694e550ffc71b3be3921aca847fdd4bd" exitCode=137 Dec 03 19:57:41.895735 master-0 kubenswrapper[9368]: I1203 19:57:41.895669 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_41b95a38663dd6fe34e183818a475977/etcdctl/0.log" Dec 03 19:57:41.896316 master-0 kubenswrapper[9368]: I1203 19:57:41.895760 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 19:57:42.005334 master-0 kubenswrapper[9368]: I1203 19:57:42.005222 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-certs\") pod \"41b95a38663dd6fe34e183818a475977\" (UID: \"41b95a38663dd6fe34e183818a475977\") " Dec 03 19:57:42.005590 master-0 kubenswrapper[9368]: I1203 19:57:42.005325 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-data-dir\") pod \"41b95a38663dd6fe34e183818a475977\" (UID: \"41b95a38663dd6fe34e183818a475977\") " Dec 03 19:57:42.005590 master-0 kubenswrapper[9368]: I1203 19:57:42.005334 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-certs" (OuterVolumeSpecName: "certs") pod "41b95a38663dd6fe34e183818a475977" (UID: "41b95a38663dd6fe34e183818a475977"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:57:42.005590 master-0 kubenswrapper[9368]: I1203 19:57:42.005369 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-data-dir" (OuterVolumeSpecName: "data-dir") pod "41b95a38663dd6fe34e183818a475977" (UID: "41b95a38663dd6fe34e183818a475977"). InnerVolumeSpecName "data-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:57:42.006495 master-0 kubenswrapper[9368]: I1203 19:57:42.006431 9368 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 19:57:42.006631 master-0 kubenswrapper[9368]: I1203 19:57:42.006506 9368 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-data-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 19:57:42.293244 master-0 kubenswrapper[9368]: I1203 19:57:42.293025 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_41b95a38663dd6fe34e183818a475977/etcdctl/0.log" Dec 03 19:57:42.293433 master-0 kubenswrapper[9368]: I1203 19:57:42.293252 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 19:57:42.421691 master-0 kubenswrapper[9368]: I1203 19:57:42.421582 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 03 19:57:42.846269 master-0 kubenswrapper[9368]: I1203 19:57:42.846138 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 03 19:57:44.123682 master-0 kubenswrapper[9368]: E1203 19:57:44.123578 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="Timeout: request did not 
complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 03 19:57:44.124499 master-0 kubenswrapper[9368]: I1203 19:57:44.124304 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 03 19:57:44.166718 master-0 kubenswrapper[9368]: W1203 19:57:44.166545 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dd8b778e190b1975a0a8fad534da6dd.slice/crio-1992b1130615c3114c9b58cd6decbf77558f0295aafbe17982440031c3ee9788 WatchSource:0}: Error finding container 1992b1130615c3114c9b58cd6decbf77558f0295aafbe17982440031c3ee9788: Status 404 returned error can't find the container with id 1992b1130615c3114c9b58cd6decbf77558f0295aafbe17982440031c3ee9788 Dec 03 19:57:44.669125 master-0 kubenswrapper[9368]: E1203 19:57:44.668815 9368 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0-master-0.187dccd1b67174be openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:41b95a38663dd6fe34e183818a475977,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:57:10.080222398 +0000 UTC m=+95.741472319,LastTimestamp:2025-12-03 19:57:10.080222398 +0000 UTC m=+95.741472319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:57:45.327810 master-0 kubenswrapper[9368]: I1203 19:57:45.327677 9368 generic.go:334] "Generic (PLEG): container finished" podID="4dd8b778e190b1975a0a8fad534da6dd" 
containerID="121d9626cd0411e9b91e157dd5da2678c7631550b10f391133d8192123b5c231" exitCode=0 Dec 03 19:57:50.778220 master-0 kubenswrapper[9368]: E1203 19:57:50.778012 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 19:57:51.053111 master-0 kubenswrapper[9368]: E1203 19:57:51.052846 9368 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 19:57:52.422303 master-0 kubenswrapper[9368]: I1203 19:57:52.422229 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 03 19:57:52.845877 master-0 kubenswrapper[9368]: I1203 19:57:52.845655 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 03 19:57:53.384272 master-0 kubenswrapper[9368]: I1203 19:57:53.384223 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/0.log" Dec 03 19:57:53.384272 master-0 kubenswrapper[9368]: I1203 19:57:53.384268 9368 generic.go:334] "Generic (PLEG): container finished" 
podID="3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf" containerID="00ef38cb5e4574cde1559c4f74b2af2d1020f41ece0ea48de28dfccd34cbb389" exitCode=1 Dec 03 19:57:58.553677 master-0 kubenswrapper[9368]: E1203 19:57:58.553615 9368 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 19:57:58.553677 master-0 kubenswrapper[9368]: I1203 19:57:58.553671 9368 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 03 19:57:59.430411 master-0 kubenswrapper[9368]: I1203 19:57:59.430320 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-76f56467d7-npd99_61b16a8a-27a2-4a07-b5f9-10a5be2ec870/kube-rbac-proxy/0.log" Dec 03 19:58:00.779165 master-0 kubenswrapper[9368]: E1203 19:58:00.779098 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 19:58:01.053705 master-0 kubenswrapper[9368]: E1203 19:58:01.053546 9368 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 19:58:01.054063 master-0 kubenswrapper[9368]: I1203 19:58:01.054030 9368 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 03 19:58:01.567304 master-0 kubenswrapper[9368]: I1203 19:58:01.567254 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-76f56467d7-npd99_61b16a8a-27a2-4a07-b5f9-10a5be2ec870/kube-rbac-proxy/0.log" Dec 03 19:58:01.568280 master-0 kubenswrapper[9368]: I1203 19:58:01.568247 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-76f56467d7-npd99_61b16a8a-27a2-4a07-b5f9-10a5be2ec870/cluster-cloud-controller-manager/0.log" Dec 03 19:58:01.568348 master-0 kubenswrapper[9368]: I1203 19:58:01.568337 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:58:01.673010 master-0 kubenswrapper[9368]: I1203 19:58:01.672967 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-images\") pod \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " Dec 03 19:58:01.673207 master-0 kubenswrapper[9368]: I1203 19:58:01.673037 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-cloud-controller-manager-operator-tls\") pod \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " Dec 03 19:58:01.673207 master-0 kubenswrapper[9368]: I1203 19:58:01.673081 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgjkp\" (UniqueName: \"kubernetes.io/projected/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-kube-api-access-rgjkp\") pod \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " Dec 03 19:58:01.673207 master-0 kubenswrapper[9368]: I1203 19:58:01.673124 9368 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-auth-proxy-config\") pod \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " Dec 03 19:58:01.673207 master-0 kubenswrapper[9368]: I1203 19:58:01.673197 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-host-etc-kube\") pod \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\" (UID: \"61b16a8a-27a2-4a07-b5f9-10a5be2ec870\") " Dec 03 19:58:01.673473 master-0 kubenswrapper[9368]: I1203 19:58:01.673448 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-host-etc-kube" (OuterVolumeSpecName: "host-etc-kube") pod "61b16a8a-27a2-4a07-b5f9-10a5be2ec870" (UID: "61b16a8a-27a2-4a07-b5f9-10a5be2ec870"). InnerVolumeSpecName "host-etc-kube". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:58:01.673559 master-0 kubenswrapper[9368]: I1203 19:58:01.673512 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-images" (OuterVolumeSpecName: "images") pod "61b16a8a-27a2-4a07-b5f9-10a5be2ec870" (UID: "61b16a8a-27a2-4a07-b5f9-10a5be2ec870"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:58:01.673940 master-0 kubenswrapper[9368]: I1203 19:58:01.673921 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "61b16a8a-27a2-4a07-b5f9-10a5be2ec870" (UID: "61b16a8a-27a2-4a07-b5f9-10a5be2ec870"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:58:01.676128 master-0 kubenswrapper[9368]: I1203 19:58:01.676083 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-cloud-controller-manager-operator-tls" (OuterVolumeSpecName: "cloud-controller-manager-operator-tls") pod "61b16a8a-27a2-4a07-b5f9-10a5be2ec870" (UID: "61b16a8a-27a2-4a07-b5f9-10a5be2ec870"). InnerVolumeSpecName "cloud-controller-manager-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:58:01.677457 master-0 kubenswrapper[9368]: I1203 19:58:01.677392 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-kube-api-access-rgjkp" (OuterVolumeSpecName: "kube-api-access-rgjkp") pod "61b16a8a-27a2-4a07-b5f9-10a5be2ec870" (UID: "61b16a8a-27a2-4a07-b5f9-10a5be2ec870"). InnerVolumeSpecName "kube-api-access-rgjkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:58:01.774214 master-0 kubenswrapper[9368]: I1203 19:58:01.774047 9368 reconciler_common.go:293] "Volume detached for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-host-etc-kube\") on node \"master-0\" DevicePath \"\"" Dec 03 19:58:01.774214 master-0 kubenswrapper[9368]: I1203 19:58:01.774088 9368 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-images\") on node \"master-0\" DevicePath \"\"" Dec 03 19:58:01.774214 master-0 kubenswrapper[9368]: I1203 19:58:01.774101 9368 reconciler_common.go:293] "Volume detached for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-cloud-controller-manager-operator-tls\") on node \"master-0\" DevicePath \"\"" Dec 03 19:58:01.774214 master-0 kubenswrapper[9368]: I1203 
19:58:01.774117 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgjkp\" (UniqueName: \"kubernetes.io/projected/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-kube-api-access-rgjkp\") on node \"master-0\" DevicePath \"\"" Dec 03 19:58:01.774214 master-0 kubenswrapper[9368]: I1203 19:58:01.774130 9368 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/61b16a8a-27a2-4a07-b5f9-10a5be2ec870-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Dec 03 19:58:02.130020 master-0 kubenswrapper[9368]: I1203 19:58:02.129944 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-76f56467d7-npd99_61b16a8a-27a2-4a07-b5f9-10a5be2ec870/cluster-cloud-controller-manager/0.log" Dec 03 19:58:02.130839 master-0 kubenswrapper[9368]: I1203 19:58:02.130027 9368 generic.go:334] "Generic (PLEG): container finished" podID="61b16a8a-27a2-4a07-b5f9-10a5be2ec870" containerID="d5e5f345f4c7214304a5c25631f848938166f13ee76c5366965060641404f3cc" exitCode=137 Dec 03 19:58:02.466144 master-0 kubenswrapper[9368]: I1203 19:58:02.465965 9368 patch_prober.go:28] interesting pod/authentication-operator-7479ffdf48-mfwhz container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": dial tcp 10.128.0.12:8443: connect: connection refused" start-of-body= Dec 03 19:58:02.466144 master-0 kubenswrapper[9368]: I1203 19:58:02.466056 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podUID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": dial tcp 10.128.0.12:8443: connect: connection refused" Dec 03 19:58:02.846376 master-0 kubenswrapper[9368]: I1203 
19:58:02.846196 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 03 19:58:03.140405 master-0 kubenswrapper[9368]: I1203 19:58:03.140312 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-76f56467d7-npd99_61b16a8a-27a2-4a07-b5f9-10a5be2ec870/kube-rbac-proxy/0.log" Dec 03 19:58:03.142276 master-0 kubenswrapper[9368]: I1203 19:58:03.142210 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-76f56467d7-npd99_61b16a8a-27a2-4a07-b5f9-10a5be2ec870/cluster-cloud-controller-manager/0.log" Dec 03 19:58:03.142462 master-0 kubenswrapper[9368]: I1203 19:58:03.142411 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" Dec 03 19:58:06.167056 master-0 kubenswrapper[9368]: I1203 19:58:06.166965 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-r2kpn_c4d45235-fb1a-4626-a41e-b1e34f7bf76e/approver/0.log" Dec 03 19:58:06.168012 master-0 kubenswrapper[9368]: I1203 19:58:06.167687 9368 generic.go:334] "Generic (PLEG): container finished" podID="c4d45235-fb1a-4626-a41e-b1e34f7bf76e" containerID="65f13f5f310f6f953b71a1a783c24c03bd5eb6d2106c3ba74515208177e8e054" exitCode=1 Dec 03 19:58:10.780263 master-0 kubenswrapper[9368]: E1203 19:58:10.779996 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 19:58:10.929294 master-0 kubenswrapper[9368]: I1203 19:58:10.929203 9368 status_manager.go:851] "Failed to get status for pod" podUID="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods kube-controller-manager-operator-b5dddf8f5-79ccj)" Dec 03 19:58:11.055255 master-0 kubenswrapper[9368]: E1203 19:58:11.055041 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Dec 03 19:58:12.464721 master-0 kubenswrapper[9368]: I1203 19:58:12.464607 9368 patch_prober.go:28] interesting pod/authentication-operator-7479ffdf48-mfwhz 
container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": dial tcp 10.128.0.12:8443: connect: connection refused" start-of-body= Dec 03 19:58:12.464721 master-0 kubenswrapper[9368]: I1203 19:58:12.464702 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podUID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": dial tcp 10.128.0.12:8443: connect: connection refused" Dec 03 19:58:12.845929 master-0 kubenswrapper[9368]: I1203 19:58:12.845711 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 03 19:58:18.671725 master-0 kubenswrapper[9368]: E1203 19:58:18.671538 9368 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187dccd1da19a9e5 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:d78739a7694769882b7e47ea5ac08a10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Killing,Message:Stopping container kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:57:10.678448613 +0000 UTC m=+96.339698524,LastTimestamp:2025-12-03 19:57:10.678448613 +0000 UTC m=+96.339698524,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:58:20.780946 master-0 kubenswrapper[9368]: E1203 19:58:20.780836 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 19:58:20.780946 master-0 kubenswrapper[9368]: E1203 19:58:20.780898 9368 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 19:58:21.257084 master-0 kubenswrapper[9368]: E1203 19:58:21.256970 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms" Dec 03 19:58:21.275005 master-0 kubenswrapper[9368]: I1203 19:58:21.274944 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_9afe01c7-825c-43d1-8425-0317cdde11d6/installer/0.log" Dec 03 19:58:21.275172 master-0 kubenswrapper[9368]: I1203 19:58:21.275012 9368 generic.go:334] "Generic (PLEG): container finished" podID="9afe01c7-825c-43d1-8425-0317cdde11d6" containerID="7defd583f52b28f4c8a42f8533bc6a235b9b9753c15d53b3d581070bd6b239c4" exitCode=1 Dec 03 19:58:21.278096 master-0 kubenswrapper[9368]: I1203 19:58:21.278044 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_186cc14f-5f58-43ca-8ffa-db07606ff0f7/installer/0.log" Dec 03 19:58:21.278251 master-0 kubenswrapper[9368]: I1203 19:58:21.278100 9368 generic.go:334] "Generic (PLEG): container finished" podID="186cc14f-5f58-43ca-8ffa-db07606ff0f7" 
containerID="5217957523f4b5166716d8ff3b268cfc1e054e38ab89fcd916d9adc0a629dce1" exitCode=1 Dec 03 19:58:22.463579 master-0 kubenswrapper[9368]: I1203 19:58:22.463473 9368 patch_prober.go:28] interesting pod/authentication-operator-7479ffdf48-mfwhz container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": dial tcp 10.128.0.12:8443: connect: connection refused" start-of-body= Dec 03 19:58:22.463579 master-0 kubenswrapper[9368]: I1203 19:58:22.463552 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podUID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": dial tcp 10.128.0.12:8443: connect: connection refused" Dec 03 19:58:22.845404 master-0 kubenswrapper[9368]: I1203 19:58:22.845154 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 03 19:58:23.788755 master-0 kubenswrapper[9368]: E1203 19:58:23.788704 9368 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 19:58:23.788755 master-0 kubenswrapper[9368]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-ff788744d-hkt6c_openshift-controller-manager_1c22cb59-5083-4be6-9998-a9e67a2c20cd_0(8bf3bda2616fa26542168c4dee137a64193300df11593d1ec752de4890ad3d3c): error adding pod openshift-controller-manager_controller-manager-ff788744d-hkt6c to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:"8bf3bda2616fa26542168c4dee137a64193300df11593d1ec752de4890ad3d3c" Netns:"/var/run/netns/3ec21b43-da13-43f2-84be-77b6ce1890ec" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-ff788744d-hkt6c;K8S_POD_INFRA_CONTAINER_ID=8bf3bda2616fa26542168c4dee137a64193300df11593d1ec752de4890ad3d3c;K8S_POD_UID=1c22cb59-5083-4be6-9998-a9e67a2c20cd" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-ff788744d-hkt6c] networking: Multus: [openshift-controller-manager/controller-manager-ff788744d-hkt6c/1c22cb59-5083-4be6-9998-a9e67a2c20cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-ff788744d-hkt6c?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 19:58:23.788755 master-0 kubenswrapper[9368]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 19:58:23.788755 master-0 kubenswrapper[9368]: > Dec 03 19:58:23.789684 master-0 kubenswrapper[9368]: E1203 19:58:23.788788 9368 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 19:58:23.789684 master-0 kubenswrapper[9368]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_controller-manager-ff788744d-hkt6c_openshift-controller-manager_1c22cb59-5083-4be6-9998-a9e67a2c20cd_0(8bf3bda2616fa26542168c4dee137a64193300df11593d1ec752de4890ad3d3c): error adding pod openshift-controller-manager_controller-manager-ff788744d-hkt6c to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8bf3bda2616fa26542168c4dee137a64193300df11593d1ec752de4890ad3d3c" Netns:"/var/run/netns/3ec21b43-da13-43f2-84be-77b6ce1890ec" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-ff788744d-hkt6c;K8S_POD_INFRA_CONTAINER_ID=8bf3bda2616fa26542168c4dee137a64193300df11593d1ec752de4890ad3d3c;K8S_POD_UID=1c22cb59-5083-4be6-9998-a9e67a2c20cd" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-ff788744d-hkt6c] networking: Multus: [openshift-controller-manager/controller-manager-ff788744d-hkt6c/1c22cb59-5083-4be6-9998-a9e67a2c20cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-ff788744d-hkt6c?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 19:58:23.789684 master-0 kubenswrapper[9368]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 
19:58:23.789684 master-0 kubenswrapper[9368]: > pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:58:23.789684 master-0 kubenswrapper[9368]: E1203 19:58:23.788806 9368 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 19:58:23.789684 master-0 kubenswrapper[9368]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-ff788744d-hkt6c_openshift-controller-manager_1c22cb59-5083-4be6-9998-a9e67a2c20cd_0(8bf3bda2616fa26542168c4dee137a64193300df11593d1ec752de4890ad3d3c): error adding pod openshift-controller-manager_controller-manager-ff788744d-hkt6c to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8bf3bda2616fa26542168c4dee137a64193300df11593d1ec752de4890ad3d3c" Netns:"/var/run/netns/3ec21b43-da13-43f2-84be-77b6ce1890ec" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-ff788744d-hkt6c;K8S_POD_INFRA_CONTAINER_ID=8bf3bda2616fa26542168c4dee137a64193300df11593d1ec752de4890ad3d3c;K8S_POD_UID=1c22cb59-5083-4be6-9998-a9e67a2c20cd" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-ff788744d-hkt6c] networking: Multus: [openshift-controller-manager/controller-manager-ff788744d-hkt6c/1c22cb59-5083-4be6-9998-a9e67a2c20cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-ff788744d-hkt6c?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 19:58:23.789684 master-0 kubenswrapper[9368]: 
': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 19:58:23.789684 master-0 kubenswrapper[9368]: > pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:58:23.789684 master-0 kubenswrapper[9368]: E1203 19:58:23.788856 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-ff788744d-hkt6c_openshift-controller-manager(1c22cb59-5083-4be6-9998-a9e67a2c20cd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-ff788744d-hkt6c_openshift-controller-manager(1c22cb59-5083-4be6-9998-a9e67a2c20cd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-ff788744d-hkt6c_openshift-controller-manager_1c22cb59-5083-4be6-9998-a9e67a2c20cd_0(8bf3bda2616fa26542168c4dee137a64193300df11593d1ec752de4890ad3d3c): error adding pod openshift-controller-manager_controller-manager-ff788744d-hkt6c to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"8bf3bda2616fa26542168c4dee137a64193300df11593d1ec752de4890ad3d3c\\\" Netns:\\\"/var/run/netns/3ec21b43-da13-43f2-84be-77b6ce1890ec\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-ff788744d-hkt6c;K8S_POD_INFRA_CONTAINER_ID=8bf3bda2616fa26542168c4dee137a64193300df11593d1ec752de4890ad3d3c;K8S_POD_UID=1c22cb59-5083-4be6-9998-a9e67a2c20cd\\\" Path:\\\"\\\" ERRORED: error configuring pod 
[openshift-controller-manager/controller-manager-ff788744d-hkt6c] networking: Multus: [openshift-controller-manager/controller-manager-ff788744d-hkt6c/1c22cb59-5083-4be6-9998-a9e67a2c20cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-ff788744d-hkt6c?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" podUID="1c22cb59-5083-4be6-9998-a9e67a2c20cd" Dec 03 19:58:24.304920 master-0 kubenswrapper[9368]: I1203 19:58:24.304620 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q47xb_433c3273-c99e-4d68-befc-06f92d2fc8d5/cluster-baremetal-operator/0.log" Dec 03 19:58:24.304920 master-0 kubenswrapper[9368]: I1203 19:58:24.304697 9368 generic.go:334] "Generic (PLEG): container finished" podID="433c3273-c99e-4d68-befc-06f92d2fc8d5" containerID="95d0ca3a853fd9f93e01c67870d1d4d269549c7560c451b67830fa1b176c7eb8" exitCode=1 Dec 03 19:58:24.304920 master-0 kubenswrapper[9368]: I1203 19:58:24.304839 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:58:24.305483 master-0 kubenswrapper[9368]: I1203 19:58:24.305398 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:58:25.315075 master-0 kubenswrapper[9368]: I1203 19:58:25.314970 9368 generic.go:334] "Generic (PLEG): container finished" podID="af2023e1-9c7a-40af-a6bf-fba31c3565b1" containerID="46ead743a71c6c2931e92ae425d4f75d1fb17286150d55d4a739c7296e0b2be0" exitCode=0 Dec 03 19:58:31.658651 master-0 kubenswrapper[9368]: E1203 19:58:31.658535 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" interval="800ms" Dec 03 19:58:32.556361 master-0 kubenswrapper[9368]: E1203 19:58:32.556304 9368 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 19:58:32.557075 master-0 kubenswrapper[9368]: E1203 19:58:32.557038 9368 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1m8.014s" Dec 03 19:58:32.557156 master-0 kubenswrapper[9368]: I1203 19:58:32.557075 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck" event={"ID":"b5cad72f-5bbf-42fc-9d63-545a01c98cbe","Type":"ContainerDied","Data":"7d4b806f60c6e3a9fcd38f09aa10d060121f698a2f0e042f80f78d96aa5e5a4f"} Dec 03 19:58:32.557156 master-0 kubenswrapper[9368]: I1203 19:58:32.557112 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sp868" Dec 
03 19:58:32.557297 master-0 kubenswrapper[9368]: I1203 19:58:32.557230 9368 scope.go:117] "RemoveContainer" containerID="7d4b806f60c6e3a9fcd38f09aa10d060121f698a2f0e042f80f78d96aa5e5a4f" Dec 03 19:58:32.566347 master-0 kubenswrapper[9368]: I1203 19:58:32.566273 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41b95a38663dd6fe34e183818a475977" path="/var/lib/kubelet/pods/41b95a38663dd6fe34e183818a475977/volumes" Dec 03 19:58:32.567059 master-0 kubenswrapper[9368]: I1203 19:58:32.567019 9368 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 03 19:58:32.846510 master-0 kubenswrapper[9368]: I1203 19:58:32.846267 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 03 19:58:41.036330 master-0 kubenswrapper[9368]: E1203 19:58:41.036084 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T19:58:31Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T19:58:31Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T19:58:31Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T19:58:31Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a17e9d83aeb6de5f0851aaacd1a9ebddbc8a4ac3ece2e4af8670aa0c33b8fc9c\\\"],\\\"sizeBytes\\\":1631769045},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2efbb545a141552851226bea008b13d92cbb084339bcfd6923b38d23c382145e\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:e5c3ad640f9c0c84490a0e0da7a1850b7873867936a5b604c07a8075c3a710d0\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1610175307},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98ce2d349f8bc693d76d9a68097b758b987cf17ea3beb66bbd09d12fa78b4d0c\\\"],\\\"sizeBytes\\\":1232076476},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:446e5d504e70c7963ef7b0f090f3fcb19847ef48150299bf030847565d7a579b\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a01ee07f4838bab6cfa5a3d25d867557aa271725bfcd20a1e52d3cc63423c06b\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1204969293},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:31b0e25262b7daa1c7a43042f865ca936aa1a52776994642f88b9a12408d27da\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ae694324b195581f542841a64634b63bae3d63332705b3a27320d18fde2aebe8\\\",\\\"registry.redhat.io/redhat/community-opera
tor-index:v4.18\\\"],\\\"sizeBytes\\\":1201363276},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ebe9f523f5c211a3a0f2570331dddcd5be15b12c1fecd9b8b121f881bfaad029\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a31af646ce5587c051459a88df413dc30be81e7f15399aa909e19effa7de772a\\\"],\\\"sizeBytes\\\":983731853},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\\\"],\\\"sizeBytes\\\":938321573},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3c0962dbbad51633a7d97ef253d0249269bfe3bbef3bfe99a99457470e7a682\\\"],\\\"sizeBytes\\\":912736453},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dfc0403f71f7c926db1084c7fb5fb4f19007271213ee34f6f3d3eecdbe817d6b\\\"],\\\"sizeBytes\\\":874839630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e8f313372fe49afad871cc56225dcd4d31bed249abeab55fb288e1f854138fbf\\\"],\\\"sizeBytes\\\":870581225},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc72da7f7930eb09abf6f8dbe577bb537e3a2a59dc0e14a4319b42c0212218d1\\\"],\\\"sizeBytes\\\":857083855},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f8a38d71a75c4fa803249cc709d60039d14878e218afd88a86083526ee8f78ad\\\"],\\\"sizeBytes\\\":856674149},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e5c0acdd03dc840d7345ae397feaf6147a32a8fef89a0ac2ddc8d14b068c9ff\\\"],\\\"sizeBytes\\\":767313881},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:184239929f74bb7c56c1cf5b94b5f91dd4013a87034fe04b9fa1027d2bb6c5a4\\\"],\\\"sizeBytes\\\":682385666},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0d866f93bed16cfebd8019ad6b89a4dd4abedfc20ee5d28d7edad045e7df0fda\\\"],\\\"sizeBytes\\\":677540255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b99ce0f31213291444482af4af36345dc93acdbe965868073e8232797b8a2f14\\\"],\\\"sizeBytes\\\":672854011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8\\\"],\\\"sizeBytes\\\":616123373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:da6f62afd2795d1b0af69532a5534c099bbb81d4e7abd2616b374db191552c51\\\"],\\\"sizeBytes\\\":583850203},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:51a4c20765f54b6a6b5513f97cf54bb99631c2abe860949293456886a74f87fe\\\"],\\\"sizeBytes\\\":576621883},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc9758be9f0f0a480fb5e119ecb1e1101ef807bdc765a155212a8188d79b9e60\\\"],\\\"sizeBytes\\\":552687886},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32236659da74056138c839429f304a96ba36dd304d7eefb6b2618ecfdf6308e3\\\"],\\\"sizeBytes\\\":551903461},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e8903affdf29401b9a86b9f58795c9f445f34194960c7b2734f30601c48cbdf\\\"],\\\"sizeBytes\\\":543241813},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c921698d30c8175da0c124f72748e93551d6903b0f34d26743b60cb12d25cb1\\\"],\\\"sizeBytes\\\":532668041},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ceaa4102b35e54be54e23c8ea73bb0dac4978cffb54105ad00b51393f47595da\\\"],\\\"sizeBytes\\\":532338751},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ca4933b9ba55069205ea53970128c4e8c4b46560ef721c8aaee00aaf736664b5\\\"],\\\"sizeBytes\\\":512852463},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:98c80d92a2ef8d44ee625b229b77b7bfdb1b06cbfe0d4d
f9e2ca2cba904467f7\\\"],\\\"sizeBytes\\\":512468025},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\\\"],\\\"sizeBytes\\\":509451797},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae8c6193ace2c439dd93d8129f68f3704727650851a628c906bff9290940ef03\\\"],\\\"sizeBytes\\\":508056015},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a2ef63f356c11ba629d8038474ab287797340de1219b4fee97c386975689110\\\"],\\\"sizeBytes\\\":507701628},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:84a52132860e74998981b76c08d38543561197c3da77836c670fa8e394c5ec17\\\"],\\\"sizeBytes\\\":506755373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:492103a8365ef9a1d5f237b4ba90aff87369167ec91db29ff0251ba5aab2b419\\\"],\\\"sizeBytes\\\":505663073},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2b518cb834a0b6ca50d73eceb5f8e64aefb09094d39e4ba0d8e4632f6cdf908\\\"],\\\"sizeBytes\\\":505642108},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:58ed827ee19ac91b6f860d307797b24b8aec02e671605388c4afe4fa19ddfc36\\\"],\\\"sizeBytes\\\":503354646},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eefdc67602b8bc3941001b030ab95d82e10432f814634b80eb8ce45bc9ebd3de\\\"],\\\"sizeBytes\\\":503025552},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3051af3343018fecbf3a6edacea69de841fc5211c09e7fb6a2499188dc979395\\\"],\\\"sizeBytes\\\":502450335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4cb6ecfb89e53653b69ae494ebc940b9fcf7b7db317b156e186435cc541589d9\\\"],\\\"sizeBytes\\\":500957387},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3d835ce07d1bec4a4b13f0bca5ea20ea5c781ea7853d7b42310f4ad8aeba6d7c\\\"],\\\"sizeBytes\\\":500863090},{\\\"names\\\":[\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:49a6a3308d885301c7718a465f1af2d08a617abbdff23352d5422d1ae4af33cf\\\"],\\\"sizeBytes\\\":499812475},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e254a7fb8a2643817718cfdb54bc819e86eb84232f6e2456548c55c5efb09d2\\\"],\\\"sizeBytes\\\":499719811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:44e82a51fce7b5996b183c10c44bd79b0e1ae2257fd5809345fbca1c50aaa08f\\\"],\\\"sizeBytes\\\":499138950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:93145fd0c004dc4fca21435a32c7e55e962f321aff260d702f387cfdebee92a5\\\"],\\\"sizeBytes\\\":499096673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0c6de747539dd00ede882fb4f73cead462bf0a7efda7173fd5d443ef7a00251\\\"],\\\"sizeBytes\\\":490470354},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6199be91b821875ba2609cf7fa886b74b9a8b573622fe33cc1bc39cd55acac08\\\"],\\\"sizeBytes\\\":489542560},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebd79294a663cb38370ae81f9cda91cef7fb1370ec5b495b4bdb95e77272e6a8\\\"],\\\"sizeBytes\\\":481573011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b4e0b20fdb38d516e871ff5d593c4273cc9933cb6a65ec93e727ca4a7777fd20\\\"],\\\"sizeBytes\\\":478931717},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a3e2790bda8898df5e4e9cf1878103ac483ea1633819d76ea68976b0b2062b6\\\"],\\\"sizeBytes\\\":478655954},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b294511902fd7a80e135b23895a944570932dc0fab1ee22f296523840740332e\\\"],\\\"sizeBytes\\\":465302163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23aa409d98c18a25b5dd3c14b4c5a88eba2c793d020f2deb3bafd58a2225c328\\\"],\\\"sizeBytes\\\":465158513},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:656fe650bac2929182cd0cf7d7e566d089f69e06541b8329c6d40b89346c03ca\\\
"],\\\"sizeBytes\\\":462741734}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 19:58:42.460053 master-0 kubenswrapper[9368]: E1203 19:58:42.459953 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Dec 03 19:58:42.845846 master-0 kubenswrapper[9368]: I1203 19:58:42.845579 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 03 19:58:51.037576 master-0 kubenswrapper[9368]: E1203 19:58:51.037462 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 19:58:52.674752 master-0 kubenswrapper[9368]: E1203 19:58:52.674605 9368 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" 
event="&Event{ObjectMeta:{kube-controller-manager-operator-b5dddf8f5-79ccj.187dccd3192039c4 openshift-kube-controller-manager-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager-operator,Name:kube-controller-manager-operator-b5dddf8f5-79ccj,UID:e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3,APIVersion:v1,ResourceVersion:3929,FieldPath:spec.containers{kube-controller-manager-operator},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:58ed827ee19ac91b6f860d307797b24b8aec02e671605388c4afe4fa19ddfc36\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:57:16.030810564 +0000 UTC m=+101.692060475,LastTimestamp:2025-12-03 19:57:16.030810564 +0000 UTC m=+101.692060475,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:58:52.846330 master-0 kubenswrapper[9368]: I1203 19:58:52.846191 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 03 19:58:54.060756 master-0 kubenswrapper[9368]: E1203 19:58:54.060636 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 03 19:58:54.532169 master-0 kubenswrapper[9368]: I1203 19:58:54.532060 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-j2wgx_5b3ee9a2-0f17-4a04-9191-b60684ef6c29/kube-scheduler-operator-container/1.log" Dec 03 19:58:54.533108 master-0 kubenswrapper[9368]: I1203 19:58:54.533020 9368 generic.go:334] "Generic (PLEG): container finished" podID="5b3ee9a2-0f17-4a04-9191-b60684ef6c29" containerID="654ce27fb70f480beba5ca8af4a5c2faaea9183cad789692159d1b32739ab7ee" exitCode=255 Dec 03 19:58:54.536268 master-0 kubenswrapper[9368]: I1203 19:58:54.536207 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/0.log" Dec 03 19:58:54.536408 master-0 kubenswrapper[9368]: I1203 19:58:54.536285 9368 generic.go:334] "Generic (PLEG): container finished" podID="f749c7f2-1fd7-4078-a92d-0ae5523998ac" containerID="744faadce32102a4f51bf311e6ce0b868fa1346e51cfebbdff76ea1eb3693fe2" exitCode=255 Dec 03 19:58:54.539476 master-0 kubenswrapper[9368]: I1203 19:58:54.539410 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/1.log" Dec 03 19:58:54.540290 master-0 kubenswrapper[9368]: I1203 19:58:54.540239 9368 generic.go:334] "Generic (PLEG): container finished" podID="01d51d9a-9beb-4357-9dc2-aeac210cd0c4" containerID="16f863a99a7b4db6f75ba856ee48509b29d62e76913caec7ed378fa26c23b8d6" exitCode=255 Dec 03 19:58:54.543275 master-0 kubenswrapper[9368]: I1203 19:58:54.543228 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/1.log" Dec 03 19:58:54.544090 master-0 kubenswrapper[9368]: I1203 19:58:54.544019 9368 generic.go:334] "Generic (PLEG): container 
finished" podID="11e2c94f-f9e9-415b-a550-3006a4632ba4" containerID="49a13ebc694f26cd89010ddce04800eb4f4c986f75a07318bd04a364d89d8c75" exitCode=255 Dec 03 19:58:54.546959 master-0 kubenswrapper[9368]: I1203 19:58:54.546920 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/1.log" Dec 03 19:58:54.547687 master-0 kubenswrapper[9368]: I1203 19:58:54.547632 9368 generic.go:334] "Generic (PLEG): container finished" podID="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f" containerID="d547bb93c93c86e1c0269c4fb32a10d62340ebafe98b4ab6c6927fd1a6493839" exitCode=255 Dec 03 19:58:54.550695 master-0 kubenswrapper[9368]: I1203 19:58:54.550637 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/1.log" Dec 03 19:58:54.551924 master-0 kubenswrapper[9368]: I1203 19:58:54.551867 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/0.log" Dec 03 19:58:54.552018 master-0 kubenswrapper[9368]: I1203 19:58:54.551957 9368 generic.go:334] "Generic (PLEG): container finished" podID="daa8efc0-4514-4a14-80f5-ab9eca53a127" containerID="12ba33f367264d50b59a4676b1e61bc0a6d45703296fe265553724b4dbafb201" exitCode=255 Dec 03 19:58:54.554361 master-0 kubenswrapper[9368]: I1203 19:58:54.554301 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/1.log" Dec 03 19:58:54.554961 master-0 kubenswrapper[9368]: I1203 19:58:54.554912 9368 generic.go:334] "Generic (PLEG): 
container finished" podID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" containerID="341b601d165e50b66bc27ce1e4916e7fa4ef52e1059015c14333dc841ef12229" exitCode=255 Dec 03 19:58:54.557037 master-0 kubenswrapper[9368]: I1203 19:58:54.556992 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/1.log" Dec 03 19:58:54.557516 master-0 kubenswrapper[9368]: I1203 19:58:54.557463 9368 generic.go:334] "Generic (PLEG): container finished" podID="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3" containerID="e8e3349c588e032c5b231176a3b6f6e0b37f7441a8fa80e01747aaded424ead5" exitCode=255 Dec 03 19:58:54.559248 master-0 kubenswrapper[9368]: I1203 19:58:54.559199 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/1.log" Dec 03 19:58:54.559721 master-0 kubenswrapper[9368]: I1203 19:58:54.559671 9368 generic.go:334] "Generic (PLEG): container finished" podID="6eb4700c-6af0-468b-afc8-1e09b902d6bf" containerID="441d867492f0f10ece1761a5339bcd749dc935547bbd2edddb84af4fe04b1249" exitCode=255 Dec 03 19:58:54.561817 master-0 kubenswrapper[9368]: I1203 19:58:54.561742 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/1.log" Dec 03 19:58:54.562568 master-0 kubenswrapper[9368]: I1203 19:58:54.562504 9368 generic.go:334] "Generic (PLEG): container finished" podID="943feb0d-7d31-446a-9100-dfc4ef013d12" containerID="0acf9557821accd587e8bd9912ad989c059f24ef17109b73584eca0d899729a7" exitCode=255 Dec 03 19:59:01.038913 master-0 kubenswrapper[9368]: E1203 19:59:01.038813 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node 
\"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 19:59:01.417181 master-0 kubenswrapper[9368]: I1203 19:59:01.417114 9368 patch_prober.go:28] interesting pod/marketplace-operator-7d67745bb7-xqvv6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.11:8080/healthz\": dial tcp 10.128.0.11:8080: connect: connection refused" start-of-body= Dec 03 19:59:01.417482 master-0 kubenswrapper[9368]: I1203 19:59:01.417188 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" podUID="b673cb04-f6f0-4113-bdcd-d6685b942c9f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.11:8080/healthz\": dial tcp 10.128.0.11:8080: connect: connection refused" Dec 03 19:59:01.417482 master-0 kubenswrapper[9368]: I1203 19:59:01.417237 9368 patch_prober.go:28] interesting pod/marketplace-operator-7d67745bb7-xqvv6 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.11:8080/healthz\": dial tcp 10.128.0.11:8080: connect: connection refused" start-of-body= Dec 03 19:59:01.417482 master-0 kubenswrapper[9368]: I1203 19:59:01.417286 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" podUID="b673cb04-f6f0-4113-bdcd-d6685b942c9f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.11:8080/healthz\": dial tcp 10.128.0.11:8080: connect: connection refused" Dec 03 19:59:01.611604 master-0 kubenswrapper[9368]: I1203 19:59:01.611541 9368 generic.go:334] "Generic (PLEG): container finished" podID="b673cb04-f6f0-4113-bdcd-d6685b942c9f" containerID="efb0326864f224addc60569e753ed4f7ba080c2fc63c85d174a9de0f4aa3dad6" exitCode=0 Dec 03 
19:59:02.845733 master-0 kubenswrapper[9368]: I1203 19:59:02.845646 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 03 19:59:06.570238 master-0 kubenswrapper[9368]: E1203 19:59:06.570094 9368 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 19:59:06.570238 master-0 kubenswrapper[9368]: I1203 19:59:06.570172 9368 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Dec 03 19:59:07.261566 master-0 kubenswrapper[9368]: E1203 19:59:07.261437 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Dec 03 19:59:10.931149 master-0 kubenswrapper[9368]: I1203 19:59:10.931011 9368 status_manager.go:851] "Failed to get status for pod" podUID="61ca5373-413c-4824-ba19-13b99c3081e4" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods machine-approver-5775bfbf6d-psrtz)" Dec 03 19:59:11.039507 master-0 kubenswrapper[9368]: E1203 19:59:11.039366 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 
19:59:11.417387 master-0 kubenswrapper[9368]: I1203 19:59:11.417299 9368 patch_prober.go:28] interesting pod/marketplace-operator-7d67745bb7-xqvv6 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.11:8080/healthz\": dial tcp 10.128.0.11:8080: connect: connection refused" start-of-body= Dec 03 19:59:11.417688 master-0 kubenswrapper[9368]: I1203 19:59:11.417311 9368 patch_prober.go:28] interesting pod/marketplace-operator-7d67745bb7-xqvv6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.11:8080/healthz\": dial tcp 10.128.0.11:8080: connect: connection refused" start-of-body= Dec 03 19:59:11.417688 master-0 kubenswrapper[9368]: I1203 19:59:11.417415 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" podUID="b673cb04-f6f0-4113-bdcd-d6685b942c9f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.11:8080/healthz\": dial tcp 10.128.0.11:8080: connect: connection refused" Dec 03 19:59:11.417688 master-0 kubenswrapper[9368]: I1203 19:59:11.417467 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" podUID="b673cb04-f6f0-4113-bdcd-d6685b942c9f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.11:8080/healthz\": dial tcp 10.128.0.11:8080: connect: connection refused" Dec 03 19:59:12.689005 master-0 kubenswrapper[9368]: I1203 19:59:12.688922 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-5f78c89466-vkcnf_73b7027e-44f5-4c7b-9226-585a90530535/manager/0.log" Dec 03 19:59:12.690111 master-0 kubenswrapper[9368]: I1203 19:59:12.689022 9368 generic.go:334] "Generic (PLEG): container finished" podID="73b7027e-44f5-4c7b-9226-585a90530535" 
containerID="3595f145ca5f9a4066302e9ae5d79e04995d58d28db2a03322a4e2a341e9fec2" exitCode=1 Dec 03 19:59:12.692276 master-0 kubenswrapper[9368]: I1203 19:59:12.692205 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-754cfd84-xfv5j_1f82c7a1-ec21-497d-86f2-562cafa7ace7/manager/0.log" Dec 03 19:59:12.692957 master-0 kubenswrapper[9368]: I1203 19:59:12.692898 9368 generic.go:334] "Generic (PLEG): container finished" podID="1f82c7a1-ec21-497d-86f2-562cafa7ace7" containerID="026026ef6ee70bf24fbc2d66c86cdbf2ce61498e9a51c23017b8994c7f1700dd" exitCode=1 Dec 03 19:59:12.846197 master-0 kubenswrapper[9368]: I1203 19:59:12.846109 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 03 19:59:15.717758 master-0 kubenswrapper[9368]: I1203 19:59:15.717676 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/0.log" Dec 03 19:59:15.717758 master-0 kubenswrapper[9368]: I1203 19:59:15.717749 9368 generic.go:334] "Generic (PLEG): container finished" podID="367c2c7c-1fc8-4608-aa94-b64c6c70cc61" containerID="0e21d1a78f01b2c86e8a517177c7568f6695fa81ca18975759c979beb59d6b4b" exitCode=1 Dec 03 19:59:17.815094 master-0 kubenswrapper[9368]: I1203 19:59:17.814974 9368 patch_prober.go:28] interesting pod/operator-controller-controller-manager-5f78c89466-vkcnf container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.38:8081/readyz\": dial tcp 10.128.0.38:8081: connect: connection refused" start-of-body= Dec 03 19:59:17.815094 master-0 
kubenswrapper[9368]: I1203 19:59:17.815062 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" podUID="73b7027e-44f5-4c7b-9226-585a90530535" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.38:8081/readyz\": dial tcp 10.128.0.38:8081: connect: connection refused" Dec 03 19:59:17.816041 master-0 kubenswrapper[9368]: I1203 19:59:17.815130 9368 patch_prober.go:28] interesting pod/operator-controller-controller-manager-5f78c89466-vkcnf container/manager namespace/openshift-operator-controller: Liveness probe status=failure output="Get \"http://10.128.0.38:8081/healthz\": dial tcp 10.128.0.38:8081: connect: connection refused" start-of-body= Dec 03 19:59:17.816041 master-0 kubenswrapper[9368]: I1203 19:59:17.815262 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" podUID="73b7027e-44f5-4c7b-9226-585a90530535" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.38:8081/healthz\": dial tcp 10.128.0.38:8081: connect: connection refused" Dec 03 19:59:17.879695 master-0 kubenswrapper[9368]: I1203 19:59:17.879616 9368 patch_prober.go:28] interesting pod/catalogd-controller-manager-754cfd84-xfv5j container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.39:8081/readyz\": dial tcp 10.128.0.39:8081: connect: connection refused" start-of-body= Dec 03 19:59:17.879966 master-0 kubenswrapper[9368]: I1203 19:59:17.879702 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" podUID="1f82c7a1-ec21-497d-86f2-562cafa7ace7" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.39:8081/readyz\": dial tcp 10.128.0.39:8081: connect: connection refused" Dec 03 19:59:17.880187 master-0 kubenswrapper[9368]: 
I1203 19:59:17.880111 9368 patch_prober.go:28] interesting pod/catalogd-controller-manager-754cfd84-xfv5j container/manager namespace/openshift-catalogd: Liveness probe status=failure output="Get \"http://10.128.0.39:8081/healthz\": dial tcp 10.128.0.39:8081: connect: connection refused" start-of-body= Dec 03 19:59:17.880264 master-0 kubenswrapper[9368]: I1203 19:59:17.880222 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" podUID="1f82c7a1-ec21-497d-86f2-562cafa7ace7" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.39:8081/healthz\": dial tcp 10.128.0.39:8081: connect: connection refused" Dec 03 19:59:21.040453 master-0 kubenswrapper[9368]: E1203 19:59:21.040377 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 19:59:21.040453 master-0 kubenswrapper[9368]: E1203 19:59:21.040446 9368 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 19:59:21.419689 master-0 kubenswrapper[9368]: I1203 19:59:21.419613 9368 patch_prober.go:28] interesting pod/marketplace-operator-7d67745bb7-xqvv6 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.11:8080/healthz\": dial tcp 10.128.0.11:8080: connect: connection refused" start-of-body= Dec 03 19:59:21.419689 master-0 kubenswrapper[9368]: I1203 19:59:21.419644 9368 patch_prober.go:28] interesting pod/marketplace-operator-7d67745bb7-xqvv6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.11:8080/healthz\": dial tcp 10.128.0.11:8080: connect: connection refused" start-of-body= Dec 03 19:59:21.419689 master-0 
kubenswrapper[9368]: I1203 19:59:21.419681 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" podUID="b673cb04-f6f0-4113-bdcd-d6685b942c9f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.11:8080/healthz\": dial tcp 10.128.0.11:8080: connect: connection refused" Dec 03 19:59:21.420211 master-0 kubenswrapper[9368]: I1203 19:59:21.419698 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" podUID="b673cb04-f6f0-4113-bdcd-d6685b942c9f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.11:8080/healthz\": dial tcp 10.128.0.11:8080: connect: connection refused" Dec 03 19:59:22.846247 master-0 kubenswrapper[9368]: I1203 19:59:22.845990 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 03 19:59:23.663569 master-0 kubenswrapper[9368]: E1203 19:59:23.663488 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 19:59:25.051222 master-0 kubenswrapper[9368]: E1203 19:59:25.051132 9368 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 19:59:25.051222 master-0 kubenswrapper[9368]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-ff788744d-hkt6c_openshift-controller-manager_1c22cb59-5083-4be6-9998-a9e67a2c20cd_0(7d6b4aef2f3efe73080bc8a5f6c94b8c01ea3c84588a0eaaa62c65cd034da0b8): 
error adding pod openshift-controller-manager_controller-manager-ff788744d-hkt6c to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7d6b4aef2f3efe73080bc8a5f6c94b8c01ea3c84588a0eaaa62c65cd034da0b8" Netns:"/var/run/netns/38e578f0-4c07-4113-9e97-c2885a743937" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-ff788744d-hkt6c;K8S_POD_INFRA_CONTAINER_ID=7d6b4aef2f3efe73080bc8a5f6c94b8c01ea3c84588a0eaaa62c65cd034da0b8;K8S_POD_UID=1c22cb59-5083-4be6-9998-a9e67a2c20cd" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-ff788744d-hkt6c] networking: Multus: [openshift-controller-manager/controller-manager-ff788744d-hkt6c/1c22cb59-5083-4be6-9998-a9e67a2c20cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-ff788744d-hkt6c?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 19:59:25.051222 master-0 kubenswrapper[9368]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 19:59:25.051222 master-0 kubenswrapper[9368]: > Dec 03 19:59:25.052182 master-0 kubenswrapper[9368]: E1203 19:59:25.051253 9368 kuberuntime_sandbox.go:72] "Failed to create 
sandbox for pod" err=< Dec 03 19:59:25.052182 master-0 kubenswrapper[9368]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-ff788744d-hkt6c_openshift-controller-manager_1c22cb59-5083-4be6-9998-a9e67a2c20cd_0(7d6b4aef2f3efe73080bc8a5f6c94b8c01ea3c84588a0eaaa62c65cd034da0b8): error adding pod openshift-controller-manager_controller-manager-ff788744d-hkt6c to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7d6b4aef2f3efe73080bc8a5f6c94b8c01ea3c84588a0eaaa62c65cd034da0b8" Netns:"/var/run/netns/38e578f0-4c07-4113-9e97-c2885a743937" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-ff788744d-hkt6c;K8S_POD_INFRA_CONTAINER_ID=7d6b4aef2f3efe73080bc8a5f6c94b8c01ea3c84588a0eaaa62c65cd034da0b8;K8S_POD_UID=1c22cb59-5083-4be6-9998-a9e67a2c20cd" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-ff788744d-hkt6c] networking: Multus: [openshift-controller-manager/controller-manager-ff788744d-hkt6c/1c22cb59-5083-4be6-9998-a9e67a2c20cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-ff788744d-hkt6c?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 19:59:25.052182 master-0 kubenswrapper[9368]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 19:59:25.052182 master-0 kubenswrapper[9368]: > pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:59:25.052182 master-0 kubenswrapper[9368]: E1203 19:59:25.051293 9368 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 19:59:25.052182 master-0 kubenswrapper[9368]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-ff788744d-hkt6c_openshift-controller-manager_1c22cb59-5083-4be6-9998-a9e67a2c20cd_0(7d6b4aef2f3efe73080bc8a5f6c94b8c01ea3c84588a0eaaa62c65cd034da0b8): error adding pod openshift-controller-manager_controller-manager-ff788744d-hkt6c to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7d6b4aef2f3efe73080bc8a5f6c94b8c01ea3c84588a0eaaa62c65cd034da0b8" Netns:"/var/run/netns/38e578f0-4c07-4113-9e97-c2885a743937" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-ff788744d-hkt6c;K8S_POD_INFRA_CONTAINER_ID=7d6b4aef2f3efe73080bc8a5f6c94b8c01ea3c84588a0eaaa62c65cd034da0b8;K8S_POD_UID=1c22cb59-5083-4be6-9998-a9e67a2c20cd" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-ff788744d-hkt6c] networking: Multus: [openshift-controller-manager/controller-manager-ff788744d-hkt6c/1c22cb59-5083-4be6-9998-a9e67a2c20cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: SetNetworkStatus: failed to update 
the pod controller-manager-ff788744d-hkt6c in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-ff788744d-hkt6c?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 19:59:25.052182 master-0 kubenswrapper[9368]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 19:59:25.052182 master-0 kubenswrapper[9368]: > pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:59:25.052182 master-0 kubenswrapper[9368]: E1203 19:59:25.051395 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-ff788744d-hkt6c_openshift-controller-manager(1c22cb59-5083-4be6-9998-a9e67a2c20cd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-ff788744d-hkt6c_openshift-controller-manager(1c22cb59-5083-4be6-9998-a9e67a2c20cd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-ff788744d-hkt6c_openshift-controller-manager_1c22cb59-5083-4be6-9998-a9e67a2c20cd_0(7d6b4aef2f3efe73080bc8a5f6c94b8c01ea3c84588a0eaaa62c65cd034da0b8): error adding pod openshift-controller-manager_controller-manager-ff788744d-hkt6c to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"7d6b4aef2f3efe73080bc8a5f6c94b8c01ea3c84588a0eaaa62c65cd034da0b8\\\" Netns:\\\"/var/run/netns/38e578f0-4c07-4113-9e97-c2885a743937\\\" IfName:\\\"eth0\\\" 
Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-ff788744d-hkt6c;K8S_POD_INFRA_CONTAINER_ID=7d6b4aef2f3efe73080bc8a5f6c94b8c01ea3c84588a0eaaa62c65cd034da0b8;K8S_POD_UID=1c22cb59-5083-4be6-9998-a9e67a2c20cd\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-ff788744d-hkt6c] networking: Multus: [openshift-controller-manager/controller-manager-ff788744d-hkt6c/1c22cb59-5083-4be6-9998-a9e67a2c20cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-ff788744d-hkt6c?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" podUID="1c22cb59-5083-4be6-9998-a9e67a2c20cd" Dec 03 19:59:26.677589 master-0 kubenswrapper[9368]: E1203 19:59:26.677369 9368 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{control-plane-machine-set-operator-66f4cc99d4-2llfg.187dccd443852ca8 openshift-machine-api 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-api,Name:control-plane-machine-set-operator-66f4cc99d4-2llfg,UID:cd35fc5f-07ab-4c66-9b80-33a598d417ef,APIVersion:v1,ResourceVersion:7677,FieldPath:spec.containers{control-plane-machine-set-operator},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23aa409d98c18a25b5dd3c14b4c5a88eba2c793d020f2deb3bafd58a2225c328\" in 43.681s (43.681s including waiting). Image size: 465158513 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:57:21.037036712 +0000 UTC m=+106.698286623,LastTimestamp:2025-12-03 19:57:21.037036712 +0000 UTC m=+106.698286623,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 19:59:27.338233 master-0 kubenswrapper[9368]: I1203 19:59:27.338148 9368 patch_prober.go:28] interesting pod/machine-config-daemon-7t8bs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:59:27.338528 master-0 kubenswrapper[9368]: I1203 19:59:27.338235 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:59:27.814482 master-0 kubenswrapper[9368]: I1203 19:59:27.814416 9368 patch_prober.go:28] interesting pod/operator-controller-controller-manager-5f78c89466-vkcnf container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.38:8081/readyz\": dial tcp 10.128.0.38:8081: connect: 
connection refused" start-of-body= Dec 03 19:59:27.814971 master-0 kubenswrapper[9368]: I1203 19:59:27.814500 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" podUID="73b7027e-44f5-4c7b-9226-585a90530535" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.38:8081/readyz\": dial tcp 10.128.0.38:8081: connect: connection refused" Dec 03 19:59:27.880010 master-0 kubenswrapper[9368]: I1203 19:59:27.879925 9368 patch_prober.go:28] interesting pod/catalogd-controller-manager-754cfd84-xfv5j container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.39:8081/readyz\": dial tcp 10.128.0.39:8081: connect: connection refused" start-of-body= Dec 03 19:59:27.880257 master-0 kubenswrapper[9368]: I1203 19:59:27.880019 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" podUID="1f82c7a1-ec21-497d-86f2-562cafa7ace7" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.39:8081/readyz\": dial tcp 10.128.0.39:8081: connect: connection refused" Dec 03 19:59:31.417454 master-0 kubenswrapper[9368]: I1203 19:59:31.417368 9368 patch_prober.go:28] interesting pod/marketplace-operator-7d67745bb7-xqvv6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.11:8080/healthz\": dial tcp 10.128.0.11:8080: connect: connection refused" start-of-body= Dec 03 19:59:31.418332 master-0 kubenswrapper[9368]: I1203 19:59:31.417464 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" podUID="b673cb04-f6f0-4113-bdcd-d6685b942c9f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.11:8080/healthz\": dial tcp 10.128.0.11:8080: connect: connection refused" Dec 03 
19:59:32.846250 master-0 kubenswrapper[9368]: I1203 19:59:32.846127 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 03 19:59:37.703520 master-0 kubenswrapper[9368]: I1203 19:59:37.703124 9368 scope.go:117] "RemoveContainer" containerID="05747084f9e49c9f0d255ef42ef3e83cd2a8abb1990c562931e3ac0ccc06b877" Dec 03 19:59:37.728815 master-0 kubenswrapper[9368]: I1203 19:59:37.728710 9368 scope.go:117] "RemoveContainer" containerID="fc327643e61db9d9337a443f21096010694e550ffc71b3be3921aca847fdd4bd" Dec 03 19:59:37.817537 master-0 kubenswrapper[9368]: I1203 19:59:37.817400 9368 patch_prober.go:28] interesting pod/operator-controller-controller-manager-5f78c89466-vkcnf container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.38:8081/readyz\": dial tcp 10.128.0.38:8081: connect: connection refused" start-of-body= Dec 03 19:59:37.817899 master-0 kubenswrapper[9368]: I1203 19:59:37.817545 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" podUID="73b7027e-44f5-4c7b-9226-585a90530535" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.38:8081/readyz\": dial tcp 10.128.0.38:8081: connect: connection refused" Dec 03 19:59:37.817899 master-0 kubenswrapper[9368]: I1203 19:59:37.817667 9368 patch_prober.go:28] interesting pod/operator-controller-controller-manager-5f78c89466-vkcnf container/manager namespace/openshift-operator-controller: Liveness probe status=failure output="Get \"http://10.128.0.38:8081/healthz\": dial tcp 10.128.0.38:8081: connect: connection refused" start-of-body= Dec 03 19:59:37.817899 master-0 
kubenswrapper[9368]: I1203 19:59:37.817770 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" podUID="73b7027e-44f5-4c7b-9226-585a90530535" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.38:8081/healthz\": dial tcp 10.128.0.38:8081: connect: connection refused" Dec 03 19:59:37.880961 master-0 kubenswrapper[9368]: I1203 19:59:37.880889 9368 patch_prober.go:28] interesting pod/catalogd-controller-manager-754cfd84-xfv5j container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.39:8081/readyz\": dial tcp 10.128.0.39:8081: connect: connection refused" start-of-body= Dec 03 19:59:37.881244 master-0 kubenswrapper[9368]: I1203 19:59:37.880885 9368 patch_prober.go:28] interesting pod/catalogd-controller-manager-754cfd84-xfv5j container/manager namespace/openshift-catalogd: Liveness probe status=failure output="Get \"http://10.128.0.39:8081/healthz\": dial tcp 10.128.0.39:8081: connect: connection refused" start-of-body= Dec 03 19:59:37.881244 master-0 kubenswrapper[9368]: I1203 19:59:37.881062 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" podUID="1f82c7a1-ec21-497d-86f2-562cafa7ace7" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.39:8081/healthz\": dial tcp 10.128.0.39:8081: connect: connection refused" Dec 03 19:59:37.881244 master-0 kubenswrapper[9368]: I1203 19:59:37.880981 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" podUID="1f82c7a1-ec21-497d-86f2-562cafa7ace7" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.39:8081/readyz\": dial tcp 10.128.0.39:8081: connect: connection refused" Dec 03 19:59:40.573550 master-0 kubenswrapper[9368]: E1203 19:59:40.573430 9368 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 19:59:40.574825 master-0 kubenswrapper[9368]: E1203 19:59:40.573746 9368 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1m8.017s" Dec 03 19:59:40.574825 master-0 kubenswrapper[9368]: I1203 19:59:40.573825 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sp868" Dec 03 19:59:40.574825 master-0 kubenswrapper[9368]: I1203 19:59:40.574474 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:59:40.576537 master-0 kubenswrapper[9368]: I1203 19:59:40.576480 9368 scope.go:117] "RemoveContainer" containerID="12ba33f367264d50b59a4676b1e61bc0a6d45703296fe265553724b4dbafb201" Dec 03 19:59:40.576923 master-0 kubenswrapper[9368]: I1203 19:59:40.576885 9368 scope.go:117] "RemoveContainer" containerID="efb0326864f224addc60569e753ed4f7ba080c2fc63c85d174a9de0f4aa3dad6" Dec 03 19:59:40.577610 master-0 kubenswrapper[9368]: I1203 19:59:40.577499 9368 scope.go:117] "RemoveContainer" containerID="00ef38cb5e4574cde1559c4f74b2af2d1020f41ece0ea48de28dfccd34cbb389" Dec 03 19:59:40.578522 master-0 kubenswrapper[9368]: I1203 19:59:40.578140 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 19:59:40.578655 master-0 kubenswrapper[9368]: I1203 19:59:40.578593 9368 scope.go:117] "RemoveContainer" containerID="d547bb93c93c86e1c0269c4fb32a10d62340ebafe98b4ab6c6927fd1a6493839" Dec 03 19:59:40.578951 master-0 kubenswrapper[9368]: I1203 19:59:40.578916 9368 scope.go:117] "RemoveContainer" containerID="7def324ef495c1e55c8e9233ccd93d3408c35454ff9a9bc3bac5d21a48173630" Dec 03 19:59:40.579364 master-0 kubenswrapper[9368]: I1203 19:59:40.579254 9368 scope.go:117] "RemoveContainer" containerID="341b601d165e50b66bc27ce1e4916e7fa4ef52e1059015c14333dc841ef12229" Dec 03 19:59:40.579977 master-0 kubenswrapper[9368]: I1203 19:59:40.579889 9368 scope.go:117] "RemoveContainer" containerID="654ce27fb70f480beba5ca8af4a5c2faaea9183cad789692159d1b32739ab7ee" Dec 03 19:59:40.580282 master-0 kubenswrapper[9368]: I1203 19:59:40.580245 9368 scope.go:117] "RemoveContainer" containerID="e8e3349c588e032c5b231176a3b6f6e0b37f7441a8fa80e01747aaded424ead5" Dec 03 19:59:40.580630 master-0 kubenswrapper[9368]: I1203 19:59:40.580585 9368 scope.go:117] "RemoveContainer" containerID="0acf9557821accd587e8bd9912ad989c059f24ef17109b73584eca0d899729a7" Dec 03 19:59:40.581247 master-0 kubenswrapper[9368]: I1203 19:59:40.581111 9368 scope.go:117] "RemoveContainer" containerID="16f863a99a7b4db6f75ba856ee48509b29d62e76913caec7ed378fa26c23b8d6" Dec 03 19:59:40.581247 master-0 kubenswrapper[9368]: I1203 19:59:40.581176 9368 scope.go:117] "RemoveContainer" containerID="65f13f5f310f6f953b71a1a783c24c03bd5eb6d2106c3ba74515208177e8e054" Dec 03 19:59:40.583212 master-0 kubenswrapper[9368]: I1203 19:59:40.583165 9368 scope.go:117] "RemoveContainer" containerID="0e21d1a78f01b2c86e8a517177c7568f6695fa81ca18975759c979beb59d6b4b" Dec 03 19:59:40.583679 master-0 kubenswrapper[9368]: I1203 19:59:40.583632 9368 scope.go:117] "RemoveContainer" 
containerID="46ead743a71c6c2931e92ae425d4f75d1fb17286150d55d4a739c7296e0b2be0" Dec 03 19:59:40.583834 master-0 kubenswrapper[9368]: I1203 19:59:40.583747 9368 scope.go:117] "RemoveContainer" containerID="026026ef6ee70bf24fbc2d66c86cdbf2ce61498e9a51c23017b8994c7f1700dd" Dec 03 19:59:40.585962 master-0 kubenswrapper[9368]: I1203 19:59:40.585874 9368 scope.go:117] "RemoveContainer" containerID="744faadce32102a4f51bf311e6ce0b868fa1346e51cfebbdff76ea1eb3693fe2" Dec 03 19:59:40.586081 master-0 kubenswrapper[9368]: I1203 19:59:40.585977 9368 scope.go:117] "RemoveContainer" containerID="49a13ebc694f26cd89010ddce04800eb4f4c986f75a07318bd04a364d89d8c75" Dec 03 19:59:40.586081 master-0 kubenswrapper[9368]: I1203 19:59:40.586044 9368 scope.go:117] "RemoveContainer" containerID="441d867492f0f10ece1761a5339bcd749dc935547bbd2edddb84af4fe04b1249" Dec 03 19:59:40.586269 master-0 kubenswrapper[9368]: I1203 19:59:40.586243 9368 scope.go:117] "RemoveContainer" containerID="95d0ca3a853fd9f93e01c67870d1d4d269549c7560c451b67830fa1b176c7eb8" Dec 03 19:59:40.587680 master-0 kubenswrapper[9368]: I1203 19:59:40.587619 9368 scope.go:117] "RemoveContainer" containerID="3595f145ca5f9a4066302e9ae5d79e04995d58d28db2a03322a4e2a341e9fec2" Dec 03 19:59:40.593646 master-0 kubenswrapper[9368]: I1203 19:59:40.592804 9368 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 03 19:59:40.665250 master-0 kubenswrapper[9368]: E1203 19:59:40.665167 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 19:59:40.911302 master-0 kubenswrapper[9368]: I1203 19:59:40.911253 9368 generic.go:334] "Generic (PLEG): container finished" podID="d210062f-c07e-419f-a551-c37571565686" 
containerID="2d7be3731fbc745283a2d759f396c31ac1367c0ba714305c646e32b354747fdc" exitCode=0 Dec 03 19:59:41.286051 master-0 kubenswrapper[9368]: E1203 19:59:41.285899 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T19:59:31Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T19:59:31Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T19:59:31Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T19:59:31Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a17e9d83aeb6de5f0851aaacd1a9ebddbc8a4ac3ece2e4af8670aa0c33b8fc9c\\\"],\\\"sizeBytes\\\":1631769045},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2efbb545a141552851226bea008b13d92cbb084339bcfd6923b38d23c382145e\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:e5c3ad640f9c0c84490a0e0da7a1850b7873867936a5b604c07a8075c3a710d0\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1610175307},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98ce2d349f8bc693d76d9a68097b758b987cf17ea3beb66bbd09d12fa78b4d0c\\\"],\\\"sizeBytes\\\":1232076476},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:446e5d504e70c7963ef7b0f090f3fcb19847ef48150299bf030847565d7a579b\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a01ee07f4838bab6cfa5a3d25d867557aa271725bfcd20a1e52d3cc63423c06b\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1204969293},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-
operator-index@sha256:31b0e25262b7daa1c7a43042f865ca936aa1a52776994642f88b9a12408d27da\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ae694324b195581f542841a64634b63bae3d63332705b3a27320d18fde2aebe8\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201363276},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ebe9f523f5c211a3a0f2570331dddcd5be15b12c1fecd9b8b121f881bfaad029\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a31af646ce5587c051459a88df413dc30be81e7f15399aa909e19effa7de772a\\\"],\\\"sizeBytes\\\":983731853},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\\\"],\\\"sizeBytes\\\":938321573},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3c0962dbbad51633a7d97ef253d0249269bfe3bbef3bfe99a99457470e7a682\\\"],\\\"sizeBytes\\\":912736453},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dfc0403f71f7c926db1084c7fb5fb4f19007271213ee34f6f3d3eecdbe817d6b\\\"],\\\"sizeBytes\\\":874839630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e8f313372fe49afad871cc56225dcd4d31bed249abeab55fb288e1f854138fbf\\\"],\\\"sizeBytes\\\":870581225},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc72da7f7930eb09abf6f8dbe577bb537e3a2a59dc0e14a4319b42c0212218d1\\\"],\\\"sizeBytes\\\":857083855},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f8a38d71a75c4fa803249cc709d60039d14878e218afd88a86083526ee8f78ad\\\"],\\\"sizeBytes\\\":856674149},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e5c0acdd03dc840d7345ae
397feaf6147a32a8fef89a0ac2ddc8d14b068c9ff\\\"],\\\"sizeBytes\\\":767313881},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:184239929f74bb7c56c1cf5b94b5f91dd4013a87034fe04b9fa1027d2bb6c5a4\\\"],\\\"sizeBytes\\\":682385666},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0d866f93bed16cfebd8019ad6b89a4dd4abedfc20ee5d28d7edad045e7df0fda\\\"],\\\"sizeBytes\\\":677540255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b99ce0f31213291444482af4af36345dc93acdbe965868073e8232797b8a2f14\\\"],\\\"sizeBytes\\\":672854011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8\\\"],\\\"sizeBytes\\\":616123373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:da6f62afd2795d1b0af69532a5534c099bbb81d4e7abd2616b374db191552c51\\\"],\\\"sizeBytes\\\":583850203},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:51a4c20765f54b6a6b5513f97cf54bb99631c2abe860949293456886a74f87fe\\\"],\\\"sizeBytes\\\":576621883},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc9758be9f0f0a480fb5e119ecb1e1101ef807bdc765a155212a8188d79b9e60\\\"],\\\"sizeBytes\\\":552687886},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32236659da74056138c839429f304a96ba36dd304d7eefb6b2618ecfdf6308e3\\\"],\\\"sizeBytes\\\":551903461},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e8903affdf29401b9a86b9f58795c9f445f34194960c7b2734f30601c48cbdf\\\"],\\\"sizeBytes\\\":543241813},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c921698d30c8175da0c124f72748e93551d6903b0f34d26743b60cb12d25cb1\\\"],\\\"sizeBytes\\\":532668041},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ceaa4102b35e54be54e23c8ea73bb0dac4978cffb54105ad00b51393f47595da\\\"],\\\"sizeBytes\\\":532338751},{\\\"names\\\":[\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ca4933b9ba55069205ea53970128c4e8c4b46560ef721c8aaee00aaf736664b5\\\"],\\\"sizeBytes\\\":512852463},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:98c80d92a2ef8d44ee625b229b77b7bfdb1b06cbfe0d4df9e2ca2cba904467f7\\\"],\\\"sizeBytes\\\":512468025},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\\\"],\\\"sizeBytes\\\":509451797},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae8c6193ace2c439dd93d8129f68f3704727650851a628c906bff9290940ef03\\\"],\\\"sizeBytes\\\":508056015},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a2ef63f356c11ba629d8038474ab287797340de1219b4fee97c386975689110\\\"],\\\"sizeBytes\\\":507701628},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:84a52132860e74998981b76c08d38543561197c3da77836c670fa8e394c5ec17\\\"],\\\"sizeBytes\\\":506755373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:492103a8365ef9a1d5f237b4ba90aff87369167ec91db29ff0251ba5aab2b419\\\"],\\\"sizeBytes\\\":505663073},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2b518cb834a0b6ca50d73eceb5f8e64aefb09094d39e4ba0d8e4632f6cdf908\\\"],\\\"sizeBytes\\\":505642108},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:58ed827ee19ac91b6f860d307797b24b8aec02e671605388c4afe4fa19ddfc36\\\"],\\\"sizeBytes\\\":503354646},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eefdc67602b8bc3941001b030ab95d82e10432f814634b80eb8ce45bc9ebd3de\\\"],\\\"sizeBytes\\\":503025552},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3051af3343018fecbf3a6edacea69de841fc5211c09e7fb6a2499188dc979395\\\"],\\\"sizeBytes\\\":502450335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4cb6ecfb89e53653b69ae494ebc940b9fcf7b7db317b156e1
86435cc541589d9\\\"],\\\"sizeBytes\\\":500957387},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3d835ce07d1bec4a4b13f0bca5ea20ea5c781ea7853d7b42310f4ad8aeba6d7c\\\"],\\\"sizeBytes\\\":500863090},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49a6a3308d885301c7718a465f1af2d08a617abbdff23352d5422d1ae4af33cf\\\"],\\\"sizeBytes\\\":499812475},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e254a7fb8a2643817718cfdb54bc819e86eb84232f6e2456548c55c5efb09d2\\\"],\\\"sizeBytes\\\":499719811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:44e82a51fce7b5996b183c10c44bd79b0e1ae2257fd5809345fbca1c50aaa08f\\\"],\\\"sizeBytes\\\":499138950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:93145fd0c004dc4fca21435a32c7e55e962f321aff260d702f387cfdebee92a5\\\"],\\\"sizeBytes\\\":499096673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0c6de747539dd00ede882fb4f73cead462bf0a7efda7173fd5d443ef7a00251\\\"],\\\"sizeBytes\\\":490470354},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6199be91b821875ba2609cf7fa886b74b9a8b573622fe33cc1bc39cd55acac08\\\"],\\\"sizeBytes\\\":489542560},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebd79294a663cb38370ae81f9cda91cef7fb1370ec5b495b4bdb95e77272e6a8\\\"],\\\"sizeBytes\\\":481573011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b4e0b20fdb38d516e871ff5d593c4273cc9933cb6a65ec93e727ca4a7777fd20\\\"],\\\"sizeBytes\\\":478931717},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a3e2790bda8898df5e4e9cf1878103ac483ea1633819d76ea68976b0b2062b6\\\"],\\\"sizeBytes\\\":478655954},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b294511902fd7a80e135b23895a944570932dc0fab1ee22f296523840740332e\\\"],\\\"sizeBytes\\\":465302163},{\\\"names\\\":[\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:23aa409d98c18a25b5dd3c14b4c5a88eba2c793d020f2deb3bafd58a2225c328\\\"],\\\"sizeBytes\\\":465158513},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:656fe650bac2929182cd0cf7d7e566d089f69e06541b8329c6d40b89346c03ca\\\"],\\\"sizeBytes\\\":462741734}]}}\" for node \"master-0\": the server was unable to return a response in the time allotted, but may still be processing the request (patch nodes master-0)" Dec 03 19:59:41.309334 master-0 kubenswrapper[9368]: I1203 19:59:41.309299 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_9afe01c7-825c-43d1-8425-0317cdde11d6/installer/0.log" Dec 03 19:59:41.309499 master-0 kubenswrapper[9368]: I1203 19:59:41.309361 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 19:59:41.322584 master-0 kubenswrapper[9368]: I1203 19:59:41.322555 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 19:59:41.339391 master-0 kubenswrapper[9368]: I1203 19:59:41.338165 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_186cc14f-5f58-43ca-8ffa-db07606ff0f7/installer/0.log" Dec 03 19:59:41.339391 master-0 kubenswrapper[9368]: I1203 19:59:41.338231 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Dec 03 19:59:41.491145 master-0 kubenswrapper[9368]: I1203 19:59:41.491086 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ce4afc7a-a338-4a2c-bada-22d4bac75d49-kube-api-access\") pod \"ce4afc7a-a338-4a2c-bada-22d4bac75d49\" (UID: \"ce4afc7a-a338-4a2c-bada-22d4bac75d49\") " Dec 03 19:59:41.491145 master-0 kubenswrapper[9368]: I1203 19:59:41.491143 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/186cc14f-5f58-43ca-8ffa-db07606ff0f7-kubelet-dir\") pod \"186cc14f-5f58-43ca-8ffa-db07606ff0f7\" (UID: \"186cc14f-5f58-43ca-8ffa-db07606ff0f7\") " Dec 03 19:59:41.491373 master-0 kubenswrapper[9368]: I1203 19:59:41.491200 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9afe01c7-825c-43d1-8425-0317cdde11d6-var-lock\") pod \"9afe01c7-825c-43d1-8425-0317cdde11d6\" (UID: \"9afe01c7-825c-43d1-8425-0317cdde11d6\") " Dec 03 19:59:41.491373 master-0 kubenswrapper[9368]: I1203 19:59:41.491220 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/186cc14f-5f58-43ca-8ffa-db07606ff0f7-var-lock\") pod \"186cc14f-5f58-43ca-8ffa-db07606ff0f7\" (UID: \"186cc14f-5f58-43ca-8ffa-db07606ff0f7\") " Dec 03 19:59:41.491373 master-0 kubenswrapper[9368]: I1203 19:59:41.491234 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ce4afc7a-a338-4a2c-bada-22d4bac75d49-kubelet-dir\") pod \"ce4afc7a-a338-4a2c-bada-22d4bac75d49\" (UID: \"ce4afc7a-a338-4a2c-bada-22d4bac75d49\") " Dec 03 19:59:41.491373 master-0 kubenswrapper[9368]: I1203 19:59:41.491254 9368 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9afe01c7-825c-43d1-8425-0317cdde11d6-kubelet-dir\") pod \"9afe01c7-825c-43d1-8425-0317cdde11d6\" (UID: \"9afe01c7-825c-43d1-8425-0317cdde11d6\") " Dec 03 19:59:41.491373 master-0 kubenswrapper[9368]: I1203 19:59:41.491301 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9afe01c7-825c-43d1-8425-0317cdde11d6-kube-api-access\") pod \"9afe01c7-825c-43d1-8425-0317cdde11d6\" (UID: \"9afe01c7-825c-43d1-8425-0317cdde11d6\") " Dec 03 19:59:41.491373 master-0 kubenswrapper[9368]: I1203 19:59:41.491329 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ce4afc7a-a338-4a2c-bada-22d4bac75d49-var-lock\") pod \"ce4afc7a-a338-4a2c-bada-22d4bac75d49\" (UID: \"ce4afc7a-a338-4a2c-bada-22d4bac75d49\") " Dec 03 19:59:41.491373 master-0 kubenswrapper[9368]: I1203 19:59:41.491354 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/186cc14f-5f58-43ca-8ffa-db07606ff0f7-kube-api-access\") pod \"186cc14f-5f58-43ca-8ffa-db07606ff0f7\" (UID: \"186cc14f-5f58-43ca-8ffa-db07606ff0f7\") " Dec 03 19:59:41.491373 master-0 kubenswrapper[9368]: I1203 19:59:41.491351 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9afe01c7-825c-43d1-8425-0317cdde11d6-var-lock" (OuterVolumeSpecName: "var-lock") pod "9afe01c7-825c-43d1-8425-0317cdde11d6" (UID: "9afe01c7-825c-43d1-8425-0317cdde11d6"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:59:41.491601 master-0 kubenswrapper[9368]: I1203 19:59:41.491416 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce4afc7a-a338-4a2c-bada-22d4bac75d49-var-lock" (OuterVolumeSpecName: "var-lock") pod "ce4afc7a-a338-4a2c-bada-22d4bac75d49" (UID: "ce4afc7a-a338-4a2c-bada-22d4bac75d49"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:59:41.491601 master-0 kubenswrapper[9368]: I1203 19:59:41.491407 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/186cc14f-5f58-43ca-8ffa-db07606ff0f7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "186cc14f-5f58-43ca-8ffa-db07606ff0f7" (UID: "186cc14f-5f58-43ca-8ffa-db07606ff0f7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:59:41.491601 master-0 kubenswrapper[9368]: I1203 19:59:41.491442 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/186cc14f-5f58-43ca-8ffa-db07606ff0f7-var-lock" (OuterVolumeSpecName: "var-lock") pod "186cc14f-5f58-43ca-8ffa-db07606ff0f7" (UID: "186cc14f-5f58-43ca-8ffa-db07606ff0f7"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:59:41.491601 master-0 kubenswrapper[9368]: I1203 19:59:41.491446 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce4afc7a-a338-4a2c-bada-22d4bac75d49-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ce4afc7a-a338-4a2c-bada-22d4bac75d49" (UID: "ce4afc7a-a338-4a2c-bada-22d4bac75d49"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:59:41.491601 master-0 kubenswrapper[9368]: I1203 19:59:41.491425 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9afe01c7-825c-43d1-8425-0317cdde11d6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9afe01c7-825c-43d1-8425-0317cdde11d6" (UID: "9afe01c7-825c-43d1-8425-0317cdde11d6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:59:41.491742 master-0 kubenswrapper[9368]: I1203 19:59:41.491610 9368 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9afe01c7-825c-43d1-8425-0317cdde11d6-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 19:59:41.491742 master-0 kubenswrapper[9368]: I1203 19:59:41.491628 9368 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/186cc14f-5f58-43ca-8ffa-db07606ff0f7-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 19:59:41.491742 master-0 kubenswrapper[9368]: I1203 19:59:41.491639 9368 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ce4afc7a-a338-4a2c-bada-22d4bac75d49-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 19:59:41.491742 master-0 kubenswrapper[9368]: I1203 19:59:41.491651 9368 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9afe01c7-825c-43d1-8425-0317cdde11d6-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 19:59:41.491742 master-0 kubenswrapper[9368]: I1203 19:59:41.491662 9368 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ce4afc7a-a338-4a2c-bada-22d4bac75d49-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 19:59:41.491742 master-0 kubenswrapper[9368]: I1203 19:59:41.491674 9368 reconciler_common.go:293] "Volume 
detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/186cc14f-5f58-43ca-8ffa-db07606ff0f7-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 19:59:41.494268 master-0 kubenswrapper[9368]: I1203 19:59:41.494194 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce4afc7a-a338-4a2c-bada-22d4bac75d49-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ce4afc7a-a338-4a2c-bada-22d4bac75d49" (UID: "ce4afc7a-a338-4a2c-bada-22d4bac75d49"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:59:41.494333 master-0 kubenswrapper[9368]: I1203 19:59:41.494281 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9afe01c7-825c-43d1-8425-0317cdde11d6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9afe01c7-825c-43d1-8425-0317cdde11d6" (UID: "9afe01c7-825c-43d1-8425-0317cdde11d6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:59:41.495360 master-0 kubenswrapper[9368]: I1203 19:59:41.495327 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/186cc14f-5f58-43ca-8ffa-db07606ff0f7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "186cc14f-5f58-43ca-8ffa-db07606ff0f7" (UID: "186cc14f-5f58-43ca-8ffa-db07606ff0f7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:59:41.592512 master-0 kubenswrapper[9368]: I1203 19:59:41.592338 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9afe01c7-825c-43d1-8425-0317cdde11d6-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 19:59:41.592512 master-0 kubenswrapper[9368]: I1203 19:59:41.592384 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/186cc14f-5f58-43ca-8ffa-db07606ff0f7-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 19:59:41.592512 master-0 kubenswrapper[9368]: I1203 19:59:41.592403 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ce4afc7a-a338-4a2c-bada-22d4bac75d49-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 19:59:41.936537 master-0 kubenswrapper[9368]: I1203 19:59:41.936454 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-j2wgx_5b3ee9a2-0f17-4a04-9191-b60684ef6c29/kube-scheduler-operator-container/1.log" Dec 03 19:59:41.939933 master-0 kubenswrapper[9368]: I1203 19:59:41.939772 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/0.log" Dec 03 19:59:41.942098 master-0 kubenswrapper[9368]: I1203 19:59:41.942056 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_186cc14f-5f58-43ca-8ffa-db07606ff0f7/installer/0.log" Dec 03 19:59:41.942370 master-0 kubenswrapper[9368]: I1203 19:59:41.942293 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Dec 03 19:59:41.945517 master-0 kubenswrapper[9368]: I1203 19:59:41.945463 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-754cfd84-xfv5j_1f82c7a1-ec21-497d-86f2-562cafa7ace7/manager/0.log" Dec 03 19:59:41.949620 master-0 kubenswrapper[9368]: I1203 19:59:41.949563 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/1.log" Dec 03 19:59:41.954123 master-0 kubenswrapper[9368]: I1203 19:59:41.954073 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/0.log" Dec 03 19:59:41.959250 master-0 kubenswrapper[9368]: I1203 19:59:41.959199 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/1.log" Dec 03 19:59:41.962762 master-0 kubenswrapper[9368]: I1203 19:59:41.962705 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q47xb_433c3273-c99e-4d68-befc-06f92d2fc8d5/cluster-baremetal-operator/0.log" Dec 03 19:59:41.966438 master-0 kubenswrapper[9368]: I1203 19:59:41.966308 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/1.log" Dec 03 19:59:41.969713 master-0 kubenswrapper[9368]: I1203 19:59:41.969668 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/1.log" Dec 03 19:59:41.970649 master-0 kubenswrapper[9368]: I1203 19:59:41.970597 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/0.log" Dec 03 19:59:41.972617 master-0 kubenswrapper[9368]: I1203 19:59:41.972586 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/0.log" Dec 03 19:59:41.977718 master-0 kubenswrapper[9368]: I1203 19:59:41.977661 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_9afe01c7-825c-43d1-8425-0317cdde11d6/installer/0.log" Dec 03 19:59:41.977910 master-0 kubenswrapper[9368]: I1203 19:59:41.977805 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 19:59:41.980052 master-0 kubenswrapper[9368]: I1203 19:59:41.980005 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/1.log" Dec 03 19:59:41.982511 master-0 kubenswrapper[9368]: I1203 19:59:41.982458 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-r2kpn_c4d45235-fb1a-4626-a41e-b1e34f7bf76e/approver/0.log" Dec 03 19:59:41.984580 master-0 kubenswrapper[9368]: I1203 19:59:41.984523 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/1.log" Dec 03 19:59:41.986424 master-0 kubenswrapper[9368]: I1203 19:59:41.986392 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/1.log" Dec 03 19:59:41.988281 master-0 kubenswrapper[9368]: I1203 19:59:41.988209 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/1.log" Dec 03 19:59:41.990063 master-0 kubenswrapper[9368]: I1203 19:59:41.989894 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 19:59:41.992159 master-0 kubenswrapper[9368]: I1203 19:59:41.992114 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-5f78c89466-vkcnf_73b7027e-44f5-4c7b-9226-585a90530535/manager/0.log" Dec 03 19:59:51.286837 master-0 kubenswrapper[9368]: E1203 19:59:51.286731 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 19:59:53.591126 master-0 kubenswrapper[9368]: E1203 19:59:53.591025 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 19:59:53.595037 master-0 kubenswrapper[9368]: E1203 19:59:53.594973 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Dec 03 19:59:54.084418 master-0 kubenswrapper[9368]: I1203 19:59:54.084297 9368 generic.go:334] "Generic (PLEG): container finished" podID="4dd8b778e190b1975a0a8fad534da6dd" containerID="3c8f577be66a40b37f0664a12c17056548ea3c9d36cd14f671ca30ad04cfd997" exitCode=0 Dec 03 19:59:57.338093 master-0 kubenswrapper[9368]: I1203 19:59:57.337990 9368 patch_prober.go:28] interesting pod/machine-config-daemon-7t8bs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:59:57.339056 master-0 kubenswrapper[9368]: I1203 19:59:57.338107 9368 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:59:57.666381 master-0 kubenswrapper[9368]: E1203 19:59:57.666258 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:00:00.680674 master-0 kubenswrapper[9368]: E1203 20:00:00.680530 9368 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{cluster-baremetal-operator-5fdc576499-q47xb.187dccd4b77a0646 openshift-machine-api 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-api,Name:cluster-baremetal-operator-5fdc576499-q47xb,UID:433c3273-c99e-4d68-befc-06f92d2fc8d5,APIVersion:v1,ResourceVersion:7863,FieldPath:spec.containers{cluster-baremetal-operator},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b294511902fd7a80e135b23895a944570932dc0fab1ee22f296523840740332e\" in 44.671s (44.671s including waiting). 
Image size: 465302163 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:57:22.982463046 +0000 UTC m=+108.643712957,LastTimestamp:2025-12-03 19:57:22.982463046 +0000 UTC m=+108.643712957,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 20:00:01.287508 master-0 kubenswrapper[9368]: E1203 20:00:01.287438 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:00:10.933080 master-0 kubenswrapper[9368]: I1203 20:00:10.932936 9368 status_manager.go:851] "Failed to get status for pod" podUID="41b95a38663dd6fe34e183818a475977" pod="openshift-etcd/etcd-master-0-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods etcd-master-0-master-0)" Dec 03 20:00:11.217658 master-0 kubenswrapper[9368]: I1203 20:00:11.217505 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/1.log" Dec 03 20:00:11.218678 master-0 kubenswrapper[9368]: I1203 20:00:11.218631 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/0.log" Dec 03 20:00:11.218927 master-0 kubenswrapper[9368]: I1203 20:00:11.218892 9368 generic.go:334] "Generic (PLEG): container finished" podID="367c2c7c-1fc8-4608-aa94-b64c6c70cc61" containerID="5992edcac541fa77269930de1a02dd784ce5397190135d38e4719fad6d964b45" exitCode=1 Dec 03 20:00:11.289445 master-0 kubenswrapper[9368]: E1203 20:00:11.289273 9368 
kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:00:14.596073 master-0 kubenswrapper[9368]: E1203 20:00:14.596019 9368 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 20:00:14.597129 master-0 kubenswrapper[9368]: I1203 20:00:14.597033 9368 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Dec 03 20:00:14.668516 master-0 kubenswrapper[9368]: E1203 20:00:14.668400 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:00:21.289892 master-0 kubenswrapper[9368]: E1203 20:00:21.289754 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:00:21.289892 master-0 kubenswrapper[9368]: E1203 20:00:21.289841 9368 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 20:00:27.338508 master-0 kubenswrapper[9368]: I1203 20:00:27.338385 9368 patch_prober.go:28] interesting pod/machine-config-daemon-7t8bs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 
20:00:27.338508 master-0 kubenswrapper[9368]: I1203 20:00:27.338495 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 20:00:31.670523 master-0 kubenswrapper[9368]: E1203 20:00:31.670423 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:00:34.683944 master-0 kubenswrapper[9368]: E1203 20:00:34.683706 9368 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{cluster-autoscaler-operator-7f88444875-kqfs4.187dccd4b77bba43 openshift-machine-api 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-api,Name:cluster-autoscaler-operator-7f88444875-kqfs4,UID:b2021db5-b27a-4e06-beec-d9ba82aa1ffc,APIVersion:v1,ResourceVersion:7926,FieldPath:spec.containers{cluster-autoscaler-operator},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2d41c3e944e86b73b4ba0d037ff016562211988f3206b9deb6cc7dccca708248\" in 44.126s (44.126s including waiting). 
Image size: 450855746 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:57:22.982574659 +0000 UTC m=+108.643824600,LastTimestamp:2025-12-03 19:57:22.982574659 +0000 UTC m=+108.643824600,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 20:00:37.781803 master-0 kubenswrapper[9368]: I1203 20:00:37.781707 9368 scope.go:117] "RemoveContainer" containerID="8bcfa4660c84f8191cb52e8becfb5db2481eb6ba813d896bb3f747ba456753f9" Dec 03 20:00:41.360728 master-0 kubenswrapper[9368]: E1203 20:00:41.360630 9368 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 20:00:41.360728 master-0 kubenswrapper[9368]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-ff788744d-hkt6c_openshift-controller-manager_1c22cb59-5083-4be6-9998-a9e67a2c20cd_0(df8fad6622c5b68333e6a7dc6fc7fd75b813deb8038bd4213f5ec15d61f11ab1): error adding pod openshift-controller-manager_controller-manager-ff788744d-hkt6c to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"df8fad6622c5b68333e6a7dc6fc7fd75b813deb8038bd4213f5ec15d61f11ab1" Netns:"/var/run/netns/4247f1ed-2528-4c53-9b6d-297ca4c9d74c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-ff788744d-hkt6c;K8S_POD_INFRA_CONTAINER_ID=df8fad6622c5b68333e6a7dc6fc7fd75b813deb8038bd4213f5ec15d61f11ab1;K8S_POD_UID=1c22cb59-5083-4be6-9998-a9e67a2c20cd" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-ff788744d-hkt6c] networking: Multus: [openshift-controller-manager/controller-manager-ff788744d-hkt6c/1c22cb59-5083-4be6-9998-a9e67a2c20cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod 
controller-manager-ff788744d-hkt6c in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-ff788744d-hkt6c?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 20:00:41.360728 master-0 kubenswrapper[9368]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 20:00:41.360728 master-0 kubenswrapper[9368]: > Dec 03 20:00:41.361526 master-0 kubenswrapper[9368]: E1203 20:00:41.360765 9368 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 20:00:41.361526 master-0 kubenswrapper[9368]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-ff788744d-hkt6c_openshift-controller-manager_1c22cb59-5083-4be6-9998-a9e67a2c20cd_0(df8fad6622c5b68333e6a7dc6fc7fd75b813deb8038bd4213f5ec15d61f11ab1): error adding pod openshift-controller-manager_controller-manager-ff788744d-hkt6c to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"df8fad6622c5b68333e6a7dc6fc7fd75b813deb8038bd4213f5ec15d61f11ab1" Netns:"/var/run/netns/4247f1ed-2528-4c53-9b6d-297ca4c9d74c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-ff788744d-hkt6c;K8S_POD_INFRA_CONTAINER_ID=df8fad6622c5b68333e6a7dc6fc7fd75b813deb8038bd4213f5ec15d61f11ab1;K8S_POD_UID=1c22cb59-5083-4be6-9998-a9e67a2c20cd" 
Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-ff788744d-hkt6c] networking: Multus: [openshift-controller-manager/controller-manager-ff788744d-hkt6c/1c22cb59-5083-4be6-9998-a9e67a2c20cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-ff788744d-hkt6c?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 20:00:41.361526 master-0 kubenswrapper[9368]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 20:00:41.361526 master-0 kubenswrapper[9368]: > pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:00:41.361526 master-0 kubenswrapper[9368]: E1203 20:00:41.360814 9368 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 20:00:41.361526 master-0 kubenswrapper[9368]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-ff788744d-hkt6c_openshift-controller-manager_1c22cb59-5083-4be6-9998-a9e67a2c20cd_0(df8fad6622c5b68333e6a7dc6fc7fd75b813deb8038bd4213f5ec15d61f11ab1): error adding pod openshift-controller-manager_controller-manager-ff788744d-hkt6c to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:"df8fad6622c5b68333e6a7dc6fc7fd75b813deb8038bd4213f5ec15d61f11ab1" Netns:"/var/run/netns/4247f1ed-2528-4c53-9b6d-297ca4c9d74c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-ff788744d-hkt6c;K8S_POD_INFRA_CONTAINER_ID=df8fad6622c5b68333e6a7dc6fc7fd75b813deb8038bd4213f5ec15d61f11ab1;K8S_POD_UID=1c22cb59-5083-4be6-9998-a9e67a2c20cd" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-ff788744d-hkt6c] networking: Multus: [openshift-controller-manager/controller-manager-ff788744d-hkt6c/1c22cb59-5083-4be6-9998-a9e67a2c20cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-ff788744d-hkt6c?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 20:00:41.361526 master-0 kubenswrapper[9368]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 20:00:41.361526 master-0 kubenswrapper[9368]: > pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:00:41.361526 master-0 kubenswrapper[9368]: E1203 20:00:41.360928 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-ff788744d-hkt6c_openshift-controller-manager(1c22cb59-5083-4be6-9998-a9e67a2c20cd)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-ff788744d-hkt6c_openshift-controller-manager(1c22cb59-5083-4be6-9998-a9e67a2c20cd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-ff788744d-hkt6c_openshift-controller-manager_1c22cb59-5083-4be6-9998-a9e67a2c20cd_0(df8fad6622c5b68333e6a7dc6fc7fd75b813deb8038bd4213f5ec15d61f11ab1): error adding pod openshift-controller-manager_controller-manager-ff788744d-hkt6c to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"df8fad6622c5b68333e6a7dc6fc7fd75b813deb8038bd4213f5ec15d61f11ab1\\\" Netns:\\\"/var/run/netns/4247f1ed-2528-4c53-9b6d-297ca4c9d74c\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-ff788744d-hkt6c;K8S_POD_INFRA_CONTAINER_ID=df8fad6622c5b68333e6a7dc6fc7fd75b813deb8038bd4213f5ec15d61f11ab1;K8S_POD_UID=1c22cb59-5083-4be6-9998-a9e67a2c20cd\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-ff788744d-hkt6c] networking: Multus: [openshift-controller-manager/controller-manager-ff788744d-hkt6c/1c22cb59-5083-4be6-9998-a9e67a2c20cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-ff788744d-hkt6c?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" podUID="1c22cb59-5083-4be6-9998-a9e67a2c20cd" Dec 03 20:00:41.441593 master-0 kubenswrapper[9368]: I1203 20:00:41.441547 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q47xb_433c3273-c99e-4d68-befc-06f92d2fc8d5/cluster-baremetal-operator/1.log" Dec 03 20:00:41.442261 master-0 kubenswrapper[9368]: I1203 20:00:41.442234 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q47xb_433c3273-c99e-4d68-befc-06f92d2fc8d5/cluster-baremetal-operator/0.log" Dec 03 20:00:41.442322 master-0 kubenswrapper[9368]: I1203 20:00:41.442276 9368 generic.go:334] "Generic (PLEG): container finished" podID="433c3273-c99e-4d68-befc-06f92d2fc8d5" containerID="92f773171332fe1047c615595f130c77be75d242dc97e2d6092f29a8d7898322" exitCode=1 Dec 03 20:00:41.670972 master-0 kubenswrapper[9368]: E1203 20:00:41.670812 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:00:31Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:00:31Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:00:31Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:00:31Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a17e9d83aeb6de5f0851aaacd1a9ebddbc8a4ac3ece2e4af8670aa0c33b8fc9c\\\"],\\\"sizeBytes\\\":1631769045},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2efbb545a141552851226bea008b13d92cbb084339bcfd6923b38d23c382145e\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:e5c3ad640f9c0c84490a0e0da7a1850b7873867936a5b604c07a8075c3a710d0\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1610175307},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98ce2d349f8bc693d76d9a68097b758b987cf17ea3beb66bbd09d12fa78b4d0c\\\"],\\\"sizeBytes\\\":1232076476},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:446e5d504e70c7963ef7b0f090f3fcb19847ef48150299bf030847565d7a579b\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a01ee07f4838bab6cfa5a3d25d867557aa271725bfcd20a1e52d3cc63423c06b\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1204969293},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:31b0e25262b7daa1c7a43042f865ca936aa1a52776994642f88b9a12408d27da\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ae694324b195581f542841a64634b63bae3d63332705b3a27320d18fde2aebe8\\\",\\\"registry.redhat.io/redhat/community-opera
tor-index:v4.18\\\"],\\\"sizeBytes\\\":1201363276},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ebe9f523f5c211a3a0f2570331dddcd5be15b12c1fecd9b8b121f881bfaad029\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a31af646ce5587c051459a88df413dc30be81e7f15399aa909e19effa7de772a\\\"],\\\"sizeBytes\\\":983731853},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\\\"],\\\"sizeBytes\\\":938321573},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3c0962dbbad51633a7d97ef253d0249269bfe3bbef3bfe99a99457470e7a682\\\"],\\\"sizeBytes\\\":912736453},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dfc0403f71f7c926db1084c7fb5fb4f19007271213ee34f6f3d3eecdbe817d6b\\\"],\\\"sizeBytes\\\":874839630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e8f313372fe49afad871cc56225dcd4d31bed249abeab55fb288e1f854138fbf\\\"],\\\"sizeBytes\\\":870581225},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc72da7f7930eb09abf6f8dbe577bb537e3a2a59dc0e14a4319b42c0212218d1\\\"],\\\"sizeBytes\\\":857083855},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f8a38d71a75c4fa803249cc709d60039d14878e218afd88a86083526ee8f78ad\\\"],\\\"sizeBytes\\\":856674149},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e5c0acdd03dc840d7345ae397feaf6147a32a8fef89a0ac2ddc8d14b068c9ff\\\"],\\\"sizeBytes\\\":767313881},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:184239929f74bb7c56c1cf5b94b5f91dd4013a87034fe04b9fa1027d2bb6c5a4\\\"],\\\"sizeBytes\\\":682385666},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0d866f93bed16cfebd8019ad6b89a4dd4abedfc20ee5d28d7edad045e7df0fda\\\"],\\\"sizeBytes\\\":677540255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b99ce0f31213291444482af4af36345dc93acdbe965868073e8232797b8a2f14\\\"],\\\"sizeBytes\\\":672854011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8\\\"],\\\"sizeBytes\\\":616123373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:da6f62afd2795d1b0af69532a5534c099bbb81d4e7abd2616b374db191552c51\\\"],\\\"sizeBytes\\\":583850203},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:51a4c20765f54b6a6b5513f97cf54bb99631c2abe860949293456886a74f87fe\\\"],\\\"sizeBytes\\\":576621883},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc9758be9f0f0a480fb5e119ecb1e1101ef807bdc765a155212a8188d79b9e60\\\"],\\\"sizeBytes\\\":552687886},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32236659da74056138c839429f304a96ba36dd304d7eefb6b2618ecfdf6308e3\\\"],\\\"sizeBytes\\\":551903461},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e8903affdf29401b9a86b9f58795c9f445f34194960c7b2734f30601c48cbdf\\\"],\\\"sizeBytes\\\":543241813},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c921698d30c8175da0c124f72748e93551d6903b0f34d26743b60cb12d25cb1\\\"],\\\"sizeBytes\\\":532668041},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ceaa4102b35e54be54e23c8ea73bb0dac4978cffb54105ad00b51393f47595da\\\"],\\\"sizeBytes\\\":532338751},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ca4933b9ba55069205ea53970128c4e8c4b46560ef721c8aaee00aaf736664b5\\\"],\\\"sizeBytes\\\":512852463},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:98c80d92a2ef8d44ee625b229b77b7bfdb1b06cbfe0d4d
f9e2ca2cba904467f7\\\"],\\\"sizeBytes\\\":512468025},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\\\"],\\\"sizeBytes\\\":509451797},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae8c6193ace2c439dd93d8129f68f3704727650851a628c906bff9290940ef03\\\"],\\\"sizeBytes\\\":508056015},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a2ef63f356c11ba629d8038474ab287797340de1219b4fee97c386975689110\\\"],\\\"sizeBytes\\\":507701628},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:84a52132860e74998981b76c08d38543561197c3da77836c670fa8e394c5ec17\\\"],\\\"sizeBytes\\\":506755373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:492103a8365ef9a1d5f237b4ba90aff87369167ec91db29ff0251ba5aab2b419\\\"],\\\"sizeBytes\\\":505663073},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2b518cb834a0b6ca50d73eceb5f8e64aefb09094d39e4ba0d8e4632f6cdf908\\\"],\\\"sizeBytes\\\":505642108},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:58ed827ee19ac91b6f860d307797b24b8aec02e671605388c4afe4fa19ddfc36\\\"],\\\"sizeBytes\\\":503354646},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eefdc67602b8bc3941001b030ab95d82e10432f814634b80eb8ce45bc9ebd3de\\\"],\\\"sizeBytes\\\":503025552},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3051af3343018fecbf3a6edacea69de841fc5211c09e7fb6a2499188dc979395\\\"],\\\"sizeBytes\\\":502450335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4cb6ecfb89e53653b69ae494ebc940b9fcf7b7db317b156e186435cc541589d9\\\"],\\\"sizeBytes\\\":500957387},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3d835ce07d1bec4a4b13f0bca5ea20ea5c781ea7853d7b42310f4ad8aeba6d7c\\\"],\\\"sizeBytes\\\":500863090},{\\\"names\\\":[\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:49a6a3308d885301c7718a465f1af2d08a617abbdff23352d5422d1ae4af33cf\\\"],\\\"sizeBytes\\\":499812475},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e254a7fb8a2643817718cfdb54bc819e86eb84232f6e2456548c55c5efb09d2\\\"],\\\"sizeBytes\\\":499719811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:44e82a51fce7b5996b183c10c44bd79b0e1ae2257fd5809345fbca1c50aaa08f\\\"],\\\"sizeBytes\\\":499138950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:93145fd0c004dc4fca21435a32c7e55e962f321aff260d702f387cfdebee92a5\\\"],\\\"sizeBytes\\\":499096673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0c6de747539dd00ede882fb4f73cead462bf0a7efda7173fd5d443ef7a00251\\\"],\\\"sizeBytes\\\":490470354},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6199be91b821875ba2609cf7fa886b74b9a8b573622fe33cc1bc39cd55acac08\\\"],\\\"sizeBytes\\\":489542560},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebd79294a663cb38370ae81f9cda91cef7fb1370ec5b495b4bdb95e77272e6a8\\\"],\\\"sizeBytes\\\":481573011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b4e0b20fdb38d516e871ff5d593c4273cc9933cb6a65ec93e727ca4a7777fd20\\\"],\\\"sizeBytes\\\":478931717},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a3e2790bda8898df5e4e9cf1878103ac483ea1633819d76ea68976b0b2062b6\\\"],\\\"sizeBytes\\\":478655954},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b294511902fd7a80e135b23895a944570932dc0fab1ee22f296523840740332e\\\"],\\\"sizeBytes\\\":465302163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23aa409d98c18a25b5dd3c14b4c5a88eba2c793d020f2deb3bafd58a2225c328\\\"],\\\"sizeBytes\\\":465158513},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:656fe650bac2929182cd0cf7d7e566d089f69e06541b8329c6d40b89346c03ca\\\
"],\\\"sizeBytes\\\":462741734}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:00:42.453693 master-0 kubenswrapper[9368]: I1203 20:00:42.453561 9368 generic.go:334] "Generic (PLEG): container finished" podID="7bce50c457ac1f4721bc81a570dd238a" containerID="90564517af04049d6ec0e898c2ae0505288ea36bcc26e8b87f6cfddbd789cf9b" exitCode=1 Dec 03 20:00:48.600950 master-0 kubenswrapper[9368]: E1203 20:00:48.600741 9368 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 20:00:48.602158 master-0 kubenswrapper[9368]: E1203 20:00:48.601030 9368 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1m8.025s" Dec 03 20:00:48.602158 master-0 kubenswrapper[9368]: I1203 20:00:48.601154 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r2c8x" Dec 03 20:00:48.602158 master-0 kubenswrapper[9368]: I1203 20:00:48.601180 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sp868" Dec 03 20:00:48.602158 master-0 kubenswrapper[9368]: I1203 20:00:48.601199 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck" event={"ID":"b5cad72f-5bbf-42fc-9d63-545a01c98cbe","Type":"ContainerDied","Data":"c4dc9f4dd5e88018642a46232bff77d5e6ea06620de4db64db7e71c41383a65d"} Dec 03 20:00:48.602158 master-0 kubenswrapper[9368]: I1203 20:00:48.601353 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r2c8x" Dec 03 20:00:48.602158 master-0 kubenswrapper[9368]: I1203 
20:00:48.601582 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r2c8x" Dec 03 20:00:48.603102 master-0 kubenswrapper[9368]: I1203 20:00:48.603036 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mc8kx" Dec 03 20:00:48.603291 master-0 kubenswrapper[9368]: I1203 20:00:48.603242 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mc8kx" Dec 03 20:00:48.603371 master-0 kubenswrapper[9368]: I1203 20:00:48.603286 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" event={"ID":"61ca5373-413c-4824-ba19-13b99c3081e4","Type":"ContainerDied","Data":"4084aa8b9bc23d75642df1ea978e90c69ad65ead87bc335cb6c061e4184c4f57"} Dec 03 20:00:48.603371 master-0 kubenswrapper[9368]: I1203 20:00:48.603315 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" event={"ID":"61ca5373-413c-4824-ba19-13b99c3081e4","Type":"ContainerDied","Data":"3b0aba4add3cc1310a8315895b5136f2d6481203591c17752e5cefa8d38657ee"} Dec 03 20:00:48.603371 master-0 kubenswrapper[9368]: I1203 20:00:48.603339 9368 scope.go:117] "RemoveContainer" containerID="4084aa8b9bc23d75642df1ea978e90c69ad65ead87bc335cb6c061e4184c4f57" Dec 03 20:00:48.604136 master-0 kubenswrapper[9368]: I1203 20:00:48.604078 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6zrxk" Dec 03 20:00:48.604642 master-0 kubenswrapper[9368]: I1203 20:00:48.604595 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6zrxk" Dec 03 20:00:48.605481 master-0 kubenswrapper[9368]: I1203 20:00:48.605413 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:00:48.605601 master-0 kubenswrapper[9368]: I1203 20:00:48.605567 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mc8kx" Dec 03 20:00:48.605741 master-0 kubenswrapper[9368]: I1203 20:00:48.605678 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz" event={"ID":"61ca5373-413c-4824-ba19-13b99c3081e4","Type":"ContainerDied","Data":"aea6f3a9262e629da79db6eb6db6e7fa4b11e6388c15a6c5cfabb34f955bd062"} Dec 03 20:00:48.605873 master-0 kubenswrapper[9368]: I1203 20:00:48.605737 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"ce4afc7a-a338-4a2c-bada-22d4bac75d49","Type":"ContainerDied","Data":"6734488c6ce6905e5e770b668e83066dd3b8267a0d3cf0d97567edcd50a10461"} Dec 03 20:00:48.606694 master-0 kubenswrapper[9368]: I1203 20:00:48.606247 9368 scope.go:117] "RemoveContainer" containerID="90564517af04049d6ec0e898c2ae0505288ea36bcc26e8b87f6cfddbd789cf9b" Dec 03 20:00:48.606694 master-0 kubenswrapper[9368]: E1203 20:00:48.606606 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(7bce50c457ac1f4721bc81a570dd238a)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" Dec 03 20:00:48.619709 master-0 kubenswrapper[9368]: I1203 20:00:48.619628 9368 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Dec 03 20:00:48.628913 master-0 kubenswrapper[9368]: I1203 20:00:48.628854 9368 scope.go:117] "RemoveContainer" 
containerID="3b0aba4add3cc1310a8315895b5136f2d6481203591c17752e5cefa8d38657ee" Dec 03 20:00:48.645807 master-0 kubenswrapper[9368]: I1203 20:00:48.645764 9368 scope.go:117] "RemoveContainer" containerID="4084aa8b9bc23d75642df1ea978e90c69ad65ead87bc335cb6c061e4184c4f57" Dec 03 20:00:48.646419 master-0 kubenswrapper[9368]: E1203 20:00:48.646349 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4084aa8b9bc23d75642df1ea978e90c69ad65ead87bc335cb6c061e4184c4f57\": container with ID starting with 4084aa8b9bc23d75642df1ea978e90c69ad65ead87bc335cb6c061e4184c4f57 not found: ID does not exist" containerID="4084aa8b9bc23d75642df1ea978e90c69ad65ead87bc335cb6c061e4184c4f57" Dec 03 20:00:48.646479 master-0 kubenswrapper[9368]: I1203 20:00:48.646435 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4084aa8b9bc23d75642df1ea978e90c69ad65ead87bc335cb6c061e4184c4f57"} err="failed to get container status \"4084aa8b9bc23d75642df1ea978e90c69ad65ead87bc335cb6c061e4184c4f57\": rpc error: code = NotFound desc = could not find container \"4084aa8b9bc23d75642df1ea978e90c69ad65ead87bc335cb6c061e4184c4f57\": container with ID starting with 4084aa8b9bc23d75642df1ea978e90c69ad65ead87bc335cb6c061e4184c4f57 not found: ID does not exist" Dec 03 20:00:48.646724 master-0 kubenswrapper[9368]: I1203 20:00:48.646477 9368 scope.go:117] "RemoveContainer" containerID="3b0aba4add3cc1310a8315895b5136f2d6481203591c17752e5cefa8d38657ee" Dec 03 20:00:48.647150 master-0 kubenswrapper[9368]: E1203 20:00:48.647115 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b0aba4add3cc1310a8315895b5136f2d6481203591c17752e5cefa8d38657ee\": container with ID starting with 3b0aba4add3cc1310a8315895b5136f2d6481203591c17752e5cefa8d38657ee not found: ID does not exist" 
containerID="3b0aba4add3cc1310a8315895b5136f2d6481203591c17752e5cefa8d38657ee" Dec 03 20:00:48.647210 master-0 kubenswrapper[9368]: I1203 20:00:48.647152 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b0aba4add3cc1310a8315895b5136f2d6481203591c17752e5cefa8d38657ee"} err="failed to get container status \"3b0aba4add3cc1310a8315895b5136f2d6481203591c17752e5cefa8d38657ee\": rpc error: code = NotFound desc = could not find container \"3b0aba4add3cc1310a8315895b5136f2d6481203591c17752e5cefa8d38657ee\": container with ID starting with 3b0aba4add3cc1310a8315895b5136f2d6481203591c17752e5cefa8d38657ee not found: ID does not exist" Dec 03 20:00:48.647210 master-0 kubenswrapper[9368]: I1203 20:00:48.647177 9368 scope.go:117] "RemoveContainer" containerID="4084aa8b9bc23d75642df1ea978e90c69ad65ead87bc335cb6c061e4184c4f57" Dec 03 20:00:48.647605 master-0 kubenswrapper[9368]: I1203 20:00:48.647529 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4084aa8b9bc23d75642df1ea978e90c69ad65ead87bc335cb6c061e4184c4f57"} err="failed to get container status \"4084aa8b9bc23d75642df1ea978e90c69ad65ead87bc335cb6c061e4184c4f57\": rpc error: code = NotFound desc = could not find container \"4084aa8b9bc23d75642df1ea978e90c69ad65ead87bc335cb6c061e4184c4f57\": container with ID starting with 4084aa8b9bc23d75642df1ea978e90c69ad65ead87bc335cb6c061e4184c4f57 not found: ID does not exist" Dec 03 20:00:48.647605 master-0 kubenswrapper[9368]: I1203 20:00:48.647550 9368 scope.go:117] "RemoveContainer" containerID="3b0aba4add3cc1310a8315895b5136f2d6481203591c17752e5cefa8d38657ee" Dec 03 20:00:48.648083 master-0 kubenswrapper[9368]: I1203 20:00:48.648030 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b0aba4add3cc1310a8315895b5136f2d6481203591c17752e5cefa8d38657ee"} err="failed to get container status 
\"3b0aba4add3cc1310a8315895b5136f2d6481203591c17752e5cefa8d38657ee\": rpc error: code = NotFound desc = could not find container \"3b0aba4add3cc1310a8315895b5136f2d6481203591c17752e5cefa8d38657ee\": container with ID starting with 3b0aba4add3cc1310a8315895b5136f2d6481203591c17752e5cefa8d38657ee not found: ID does not exist" Dec 03 20:00:48.670983 master-0 kubenswrapper[9368]: E1203 20:00:48.670878 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:00:51.672085 master-0 kubenswrapper[9368]: E1203 20:00:51.671913 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:01:01.672635 master-0 kubenswrapper[9368]: E1203 20:01:01.672522 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:01:05.671826 master-0 kubenswrapper[9368]: E1203 20:01:05.671728 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:01:08.686712 master-0 kubenswrapper[9368]: E1203 20:01:08.686413 9368 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" 
event="&Event{ObjectMeta:{machine-approver-5775bfbf6d-psrtz.187dccd4b77dadb0 openshift-cluster-machine-approver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-cluster-machine-approver,Name:machine-approver-5775bfbf6d-psrtz,UID:61ca5373-413c-4824-ba19-13b99c3081e4,APIVersion:v1,ResourceVersion:7766,FieldPath:spec.containers{machine-approver-controller},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9f4724570795357eb097251a021f20c94c79b3054f3adb3bc0812143ba791dc1\" in 44.947s (44.947s including waiting). Image size: 461716546 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:57:22.982702512 +0000 UTC m=+108.643952423,LastTimestamp:2025-12-03 19:57:22.982702512 +0000 UTC m=+108.643952423,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 20:01:10.934541 master-0 kubenswrapper[9368]: I1203 20:01:10.934421 9368 status_manager.go:851] "Failed to get status for pod" podUID="acb1d894-1bc0-478d-87fc-e9137291df70" pod="openshift-marketplace/community-operators-r2c8x" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods community-operators-r2c8x)" Dec 03 20:01:11.671433 master-0 kubenswrapper[9368]: I1203 20:01:11.671393 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/2.log" Dec 03 20:01:11.671963 master-0 kubenswrapper[9368]: I1203 20:01:11.671940 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/1.log" Dec 03 20:01:11.673005 master-0 kubenswrapper[9368]: I1203 20:01:11.672935 9368 generic.go:334] "Generic (PLEG): container finished" podID="11e2c94f-f9e9-415b-a550-3006a4632ba4" containerID="b85781bda41437456bdf7f25de8ffb27da808b560ac58497b8aae4bf24b2109f" exitCode=255 Dec 03 20:01:11.673449 master-0 kubenswrapper[9368]: E1203 20:01:11.673241 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:01:11.677094 master-0 kubenswrapper[9368]: I1203 20:01:11.677049 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/2.log" Dec 03 20:01:11.677888 master-0 kubenswrapper[9368]: I1203 20:01:11.677859 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/1.log" Dec 03 20:01:11.678439 master-0 kubenswrapper[9368]: I1203 20:01:11.678423 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/0.log" Dec 03 20:01:11.678549 master-0 kubenswrapper[9368]: I1203 20:01:11.678527 9368 generic.go:334] "Generic (PLEG): container finished" podID="daa8efc0-4514-4a14-80f5-ab9eca53a127" 
containerID="81434316be96a2b14a22680f8e4bee888b65b3e97cc5bc7df607a91a047bac12" exitCode=255 Dec 03 20:01:12.687303 master-0 kubenswrapper[9368]: I1203 20:01:12.687270 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/2.log" Dec 03 20:01:12.688593 master-0 kubenswrapper[9368]: I1203 20:01:12.688547 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/1.log" Dec 03 20:01:12.689114 master-0 kubenswrapper[9368]: I1203 20:01:12.689092 9368 generic.go:334] "Generic (PLEG): container finished" podID="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f" containerID="1b19da259a1027d9535ea67f48d90da4b466d1543b7cd71e10242c0f818c0341" exitCode=255 Dec 03 20:01:12.691256 master-0 kubenswrapper[9368]: I1203 20:01:12.691241 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/2.log" Dec 03 20:01:12.691839 master-0 kubenswrapper[9368]: I1203 20:01:12.691820 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/1.log" Dec 03 20:01:12.692578 master-0 kubenswrapper[9368]: I1203 20:01:12.692536 9368 generic.go:334] "Generic (PLEG): container finished" podID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" containerID="4379513206c1639335bb05ee8287de982289e551bc3d4966f636dc8340b2ecf8" exitCode=255 Dec 03 20:01:12.694688 master-0 kubenswrapper[9368]: I1203 20:01:12.694673 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-j2wgx_5b3ee9a2-0f17-4a04-9191-b60684ef6c29/kube-scheduler-operator-container/2.log" Dec 03 20:01:12.695433 master-0 kubenswrapper[9368]: I1203 20:01:12.695399 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-j2wgx_5b3ee9a2-0f17-4a04-9191-b60684ef6c29/kube-scheduler-operator-container/1.log" Dec 03 20:01:12.695899 master-0 kubenswrapper[9368]: I1203 20:01:12.695882 9368 generic.go:334] "Generic (PLEG): container finished" podID="5b3ee9a2-0f17-4a04-9191-b60684ef6c29" containerID="fbb527c9a5f9ae83b24668268584afb30442540a16ac4e78c92bdf23a3df3b8c" exitCode=255 Dec 03 20:01:12.698263 master-0 kubenswrapper[9368]: I1203 20:01:12.698229 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/1.log" Dec 03 20:01:12.698961 master-0 kubenswrapper[9368]: I1203 20:01:12.698934 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/0.log" Dec 03 20:01:12.699155 master-0 kubenswrapper[9368]: I1203 20:01:12.699124 9368 generic.go:334] "Generic (PLEG): container finished" podID="f749c7f2-1fd7-4078-a92d-0ae5523998ac" containerID="c773db2a1877d6932c57258d42f5b394294b213e87c30f8a0a0e8aca67ad0063" exitCode=255 Dec 03 20:01:12.701468 master-0 kubenswrapper[9368]: I1203 20:01:12.701431 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/2.log" Dec 03 20:01:12.702027 master-0 kubenswrapper[9368]: I1203 20:01:12.701994 9368 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/1.log" Dec 03 20:01:12.702677 master-0 kubenswrapper[9368]: I1203 20:01:12.702645 9368 generic.go:334] "Generic (PLEG): container finished" podID="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3" containerID="3bc3b0513399a4dbff7b1cc288d3bbd1c2bcb7799b1e463991c6f6704c28e766" exitCode=255 Dec 03 20:01:12.704442 master-0 kubenswrapper[9368]: I1203 20:01:12.704401 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/2.log" Dec 03 20:01:12.705094 master-0 kubenswrapper[9368]: I1203 20:01:12.705059 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/1.log" Dec 03 20:01:12.705666 master-0 kubenswrapper[9368]: I1203 20:01:12.705620 9368 generic.go:334] "Generic (PLEG): container finished" podID="01d51d9a-9beb-4357-9dc2-aeac210cd0c4" containerID="fd46fba44017fc29b42f47a162580459a77d137ccc3daa28e90491a52f6c5e38" exitCode=255 Dec 03 20:01:12.707546 master-0 kubenswrapper[9368]: I1203 20:01:12.707491 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/2.log" Dec 03 20:01:12.708118 master-0 kubenswrapper[9368]: I1203 20:01:12.708083 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/1.log" Dec 03 20:01:12.708600 master-0 kubenswrapper[9368]: I1203 20:01:12.708560 9368 generic.go:334] "Generic (PLEG): container finished" podID="6eb4700c-6af0-468b-afc8-1e09b902d6bf" 
containerID="c45b306077a652b23f0900eb2cbb0416939e7dc4bb4d4fe2ac8622e1b6c0da5a" exitCode=255 Dec 03 20:01:12.710572 master-0 kubenswrapper[9368]: I1203 20:01:12.710517 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/2.log" Dec 03 20:01:12.711198 master-0 kubenswrapper[9368]: I1203 20:01:12.711146 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/1.log" Dec 03 20:01:12.711632 master-0 kubenswrapper[9368]: I1203 20:01:12.711593 9368 generic.go:334] "Generic (PLEG): container finished" podID="943feb0d-7d31-446a-9100-dfc4ef013d12" containerID="8c7c6a085ea6bbbf2982572791af9a9759a9fc311f8df3506418406ff3e1f36a" exitCode=255 Dec 03 20:01:21.673953 master-0 kubenswrapper[9368]: E1203 20:01:21.673877 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": context deadline exceeded" Dec 03 20:01:21.675010 master-0 kubenswrapper[9368]: E1203 20:01:21.674552 9368 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 20:01:22.622826 master-0 kubenswrapper[9368]: E1203 20:01:22.622688 9368 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 20:01:22.622826 master-0 kubenswrapper[9368]: I1203 20:01:22.622747 9368 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 03 20:01:22.674001 master-0 kubenswrapper[9368]: E1203 20:01:22.673892 9368 controller.go:145] "Failed to ensure lease exists, will 
retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:01:39.674980 master-0 kubenswrapper[9368]: E1203 20:01:39.674850 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:01:41.923322 master-0 kubenswrapper[9368]: E1203 20:01:41.922906 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:01:31Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:01:31Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:01:31Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:01:31Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a17e9d83aeb6de5f0851aaacd1a9ebddbc8a4ac3ece2e4af8670aa0c33b8fc9c\\\"],\\\"sizeBytes\\\":1631769045},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2efbb545a141552851226bea008b13d92cbb084339bcfd6923b38d23c382145e\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:e5c3ad640f9c0c84490a0e0da7a1850b7873867936a5b604c07a8075c3a710d0\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1610175307},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98ce2d349f8bc693
d76d9a68097b758b987cf17ea3beb66bbd09d12fa78b4d0c\\\"],\\\"sizeBytes\\\":1232076476},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:446e5d504e70c7963ef7b0f090f3fcb19847ef48150299bf030847565d7a579b\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a01ee07f4838bab6cfa5a3d25d867557aa271725bfcd20a1e52d3cc63423c06b\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1204969293},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:31b0e25262b7daa1c7a43042f865ca936aa1a52776994642f88b9a12408d27da\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ae694324b195581f542841a64634b63bae3d63332705b3a27320d18fde2aebe8\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201363276},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ebe9f523f5c211a3a0f2570331dddcd5be15b12c1fecd9b8b121f881bfaad029\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a31af646ce5587c051459a88df413dc30be81e7f15399aa909e19effa7de772a\\\"],\\\"sizeBytes\\\":983731853},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\\\"],\\\"sizeBytes\\\":938321573},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3c0962dbbad51633a7d97ef253d0249269bfe3bbef3bfe99a99457470e7a682\\\"],\\\"sizeBytes\\\":912736453},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dfc0403f71f7c926db1084c7fb5fb4f19007271213ee34f6f3d3eecdbe817d6b\\\"],\\\"sizeBytes\\\":874839630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e8f313372fe49afad871cc56225d
cd4d31bed249abeab55fb288e1f854138fbf\\\"],\\\"sizeBytes\\\":870581225},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc72da7f7930eb09abf6f8dbe577bb537e3a2a59dc0e14a4319b42c0212218d1\\\"],\\\"sizeBytes\\\":857083855},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f8a38d71a75c4fa803249cc709d60039d14878e218afd88a86083526ee8f78ad\\\"],\\\"sizeBytes\\\":856674149},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e5c0acdd03dc840d7345ae397feaf6147a32a8fef89a0ac2ddc8d14b068c9ff\\\"],\\\"sizeBytes\\\":767313881},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:184239929f74bb7c56c1cf5b94b5f91dd4013a87034fe04b9fa1027d2bb6c5a4\\\"],\\\"sizeBytes\\\":682385666},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0d866f93bed16cfebd8019ad6b89a4dd4abedfc20ee5d28d7edad045e7df0fda\\\"],\\\"sizeBytes\\\":677540255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b99ce0f31213291444482af4af36345dc93acdbe965868073e8232797b8a2f14\\\"],\\\"sizeBytes\\\":672854011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8\\\"],\\\"sizeBytes\\\":616123373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:da6f62afd2795d1b0af69532a5534c099bbb81d4e7abd2616b374db191552c51\\\"],\\\"sizeBytes\\\":583850203},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:51a4c20765f54b6a6b5513f97cf54bb99631c2abe860949293456886a74f87fe\\\"],\\\"sizeBytes\\\":576621883},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc9758be9f0f0a480fb5e119ecb1e1101ef807bdc765a155212a8188d79b9e60\\\"],\\\"sizeBytes\\\":552687886},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32236659da74056138c839429f304a96ba36dd304d7eefb6b2618ecfdf6308e3\\\"],\\\"sizeBytes\\\":551903461},{\\\"names\\\":[\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e8903affdf29401b9a86b9f58795c9f445f34194960c7b2734f30601c48cbdf\\\"],\\\"sizeBytes\\\":543241813},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c921698d30c8175da0c124f72748e93551d6903b0f34d26743b60cb12d25cb1\\\"],\\\"sizeBytes\\\":532668041},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ceaa4102b35e54be54e23c8ea73bb0dac4978cffb54105ad00b51393f47595da\\\"],\\\"sizeBytes\\\":532338751},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ca4933b9ba55069205ea53970128c4e8c4b46560ef721c8aaee00aaf736664b5\\\"],\\\"sizeBytes\\\":512852463},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:98c80d92a2ef8d44ee625b229b77b7bfdb1b06cbfe0d4df9e2ca2cba904467f7\\\"],\\\"sizeBytes\\\":512468025},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\\\"],\\\"sizeBytes\\\":509451797},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae8c6193ace2c439dd93d8129f68f3704727650851a628c906bff9290940ef03\\\"],\\\"sizeBytes\\\":508056015},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a2ef63f356c11ba629d8038474ab287797340de1219b4fee97c386975689110\\\"],\\\"sizeBytes\\\":507701628},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:84a52132860e74998981b76c08d38543561197c3da77836c670fa8e394c5ec17\\\"],\\\"sizeBytes\\\":506755373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:492103a8365ef9a1d5f237b4ba90aff87369167ec91db29ff0251ba5aab2b419\\\"],\\\"sizeBytes\\\":505663073},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2b518cb834a0b6ca50d73eceb5f8e64aefb09094d39e4ba0d8e4632f6cdf908\\\"],\\\"sizeBytes\\\":505642108},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:58ed827ee19ac91b6f860d307797b24b8aec02e671605388c4afe4
fa19ddfc36\\\"],\\\"sizeBytes\\\":503354646},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eefdc67602b8bc3941001b030ab95d82e10432f814634b80eb8ce45bc9ebd3de\\\"],\\\"sizeBytes\\\":503025552},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3051af3343018fecbf3a6edacea69de841fc5211c09e7fb6a2499188dc979395\\\"],\\\"sizeBytes\\\":502450335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4cb6ecfb89e53653b69ae494ebc940b9fcf7b7db317b156e186435cc541589d9\\\"],\\\"sizeBytes\\\":500957387},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3d835ce07d1bec4a4b13f0bca5ea20ea5c781ea7853d7b42310f4ad8aeba6d7c\\\"],\\\"sizeBytes\\\":500863090},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49a6a3308d885301c7718a465f1af2d08a617abbdff23352d5422d1ae4af33cf\\\"],\\\"sizeBytes\\\":499812475},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e254a7fb8a2643817718cfdb54bc819e86eb84232f6e2456548c55c5efb09d2\\\"],\\\"sizeBytes\\\":499719811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:44e82a51fce7b5996b183c10c44bd79b0e1ae2257fd5809345fbca1c50aaa08f\\\"],\\\"sizeBytes\\\":499138950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:93145fd0c004dc4fca21435a32c7e55e962f321aff260d702f387cfdebee92a5\\\"],\\\"sizeBytes\\\":499096673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0c6de747539dd00ede882fb4f73cead462bf0a7efda7173fd5d443ef7a00251\\\"],\\\"sizeBytes\\\":490470354},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6199be91b821875ba2609cf7fa886b74b9a8b573622fe33cc1bc39cd55acac08\\\"],\\\"sizeBytes\\\":489542560},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebd79294a663cb38370ae81f9cda91cef7fb1370ec5b495b4bdb95e77272e6a8\\\"],\\\"sizeBytes\\\":481573011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:b4e0b20fdb38d516e871ff5d593c4273cc9933cb6a65ec93e727ca4a7777fd20\\\"],\\\"sizeBytes\\\":478931717},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a3e2790bda8898df5e4e9cf1878103ac483ea1633819d76ea68976b0b2062b6\\\"],\\\"sizeBytes\\\":478655954},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b294511902fd7a80e135b23895a944570932dc0fab1ee22f296523840740332e\\\"],\\\"sizeBytes\\\":465302163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23aa409d98c18a25b5dd3c14b4c5a88eba2c793d020f2deb3bafd58a2225c328\\\"],\\\"sizeBytes\\\":465158513},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:656fe650bac2929182cd0cf7d7e566d089f69e06541b8329c6d40b89346c03ca\\\"],\\\"sizeBytes\\\":462741734}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:01:41.943028 master-0 kubenswrapper[9368]: I1203 20:01:41.942953 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/1.log" Dec 03 20:01:41.944629 master-0 kubenswrapper[9368]: I1203 20:01:41.944569 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/0.log" Dec 03 20:01:41.944767 master-0 kubenswrapper[9368]: I1203 20:01:41.944640 9368 generic.go:334] "Generic (PLEG): container finished" podID="3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf" containerID="739763509f2115327ad6e763bd6fe98f715c6203046f1bff98acddc694d4b998" exitCode=1 Dec 03 20:01:41.948097 master-0 kubenswrapper[9368]: I1203 20:01:41.948010 9368 generic.go:334] "Generic (PLEG): container finished" podID="af2023e1-9c7a-40af-a6bf-fba31c3565b1" 
containerID="9ee7a9ba017971cc72c48a14fbe564128a44ff608d460db457bf85730f38fd52" exitCode=0 Dec 03 20:01:42.689543 master-0 kubenswrapper[9368]: E1203 20:01:42.689284 9368 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{cluster-samples-operator-6d64b47964-h9nkv.187dccd4b77f95f1 openshift-cluster-samples-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-cluster-samples-operator,Name:cluster-samples-operator-6d64b47964-h9nkv,UID:6a82ff78-4383-4ca8-8a72-98c2ee50ffe2,APIVersion:v1,ResourceVersion:7843,FieldPath:spec.containers{cluster-samples-operator},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:912759ba49a70e63f7585b351b1deed008b5815d275f478f052c8c2880101d3c\" in 44.802s (44.802s including waiting). Image size: 449985691 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:57:22.982827505 +0000 UTC m=+108.644077416,LastTimestamp:2025-12-03 19:57:22.982827505 +0000 UTC m=+108.644077416,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 20:01:51.924483 master-0 kubenswrapper[9368]: E1203 20:01:51.924406 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:01:56.625923 master-0 kubenswrapper[9368]: E1203 20:01:56.625769 9368 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 20:01:56.626563 master-0 
kubenswrapper[9368]: E1203 20:01:56.626110 9368 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1m8.02s" Dec 03 20:01:56.626563 master-0 kubenswrapper[9368]: I1203 20:01:56.626154 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:01:56.626704 master-0 kubenswrapper[9368]: I1203 20:01:56.626630 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:01:56.627384 master-0 kubenswrapper[9368]: I1203 20:01:56.627082 9368 scope.go:117] "RemoveContainer" containerID="3bc3b0513399a4dbff7b1cc288d3bbd1c2bcb7799b1e463991c6f6704c28e766" Dec 03 20:01:56.627628 master-0 kubenswrapper[9368]: I1203 20:01:56.627584 9368 scope.go:117] "RemoveContainer" containerID="2d7be3731fbc745283a2d759f396c31ac1367c0ba714305c646e32b354747fdc" Dec 03 20:01:56.629984 master-0 kubenswrapper[9368]: I1203 20:01:56.629931 9368 scope.go:117] "RemoveContainer" containerID="81434316be96a2b14a22680f8e4bee888b65b3e97cc5bc7df607a91a047bac12" Dec 03 20:01:56.631070 master-0 kubenswrapper[9368]: I1203 20:01:56.630916 9368 scope.go:117] "RemoveContainer" containerID="fd46fba44017fc29b42f47a162580459a77d137ccc3daa28e90491a52f6c5e38" Dec 03 20:01:56.631124 master-0 kubenswrapper[9368]: I1203 20:01:56.631066 9368 scope.go:117] "RemoveContainer" containerID="4379513206c1639335bb05ee8287de982289e551bc3d4966f636dc8340b2ecf8" Dec 03 20:01:56.631174 master-0 kubenswrapper[9368]: I1203 20:01:56.631145 9368 scope.go:117] "RemoveContainer" containerID="8c7c6a085ea6bbbf2982572791af9a9759a9fc311f8df3506418406ff3e1f36a" Dec 03 20:01:56.631717 master-0 kubenswrapper[9368]: I1203 20:01:56.631653 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:01:56.631876 master-0 kubenswrapper[9368]: I1203 20:01:56.631844 9368 scope.go:117] "RemoveContainer" containerID="92f773171332fe1047c615595f130c77be75d242dc97e2d6092f29a8d7898322" Dec 03 20:01:56.633971 master-0 kubenswrapper[9368]: I1203 20:01:56.632506 9368 scope.go:117] "RemoveContainer" containerID="c45b306077a652b23f0900eb2cbb0416939e7dc4bb4d4fe2ac8622e1b6c0da5a" Dec 03 20:01:56.634915 master-0 kubenswrapper[9368]: I1203 20:01:56.634454 9368 scope.go:117] "RemoveContainer" containerID="90564517af04049d6ec0e898c2ae0505288ea36bcc26e8b87f6cfddbd789cf9b" Dec 03 20:01:56.636180 master-0 kubenswrapper[9368]: I1203 20:01:56.635165 9368 scope.go:117] "RemoveContainer" containerID="739763509f2115327ad6e763bd6fe98f715c6203046f1bff98acddc694d4b998" Dec 03 20:01:56.638012 master-0 kubenswrapper[9368]: I1203 20:01:56.637453 9368 scope.go:117] "RemoveContainer" containerID="1b19da259a1027d9535ea67f48d90da4b466d1543b7cd71e10242c0f818c0341" Dec 03 20:01:56.638196 master-0 kubenswrapper[9368]: I1203 20:01:56.638046 9368 scope.go:117] "RemoveContainer" containerID="5992edcac541fa77269930de1a02dd784ce5397190135d38e4719fad6d964b45" Dec 03 20:01:56.638830 master-0 kubenswrapper[9368]: I1203 20:01:56.638390 9368 scope.go:117] "RemoveContainer" containerID="c773db2a1877d6932c57258d42f5b394294b213e87c30f8a0a0e8aca67ad0063" Dec 03 20:01:56.638830 master-0 kubenswrapper[9368]: I1203 20:01:56.638719 9368 scope.go:117] "RemoveContainer" containerID="b85781bda41437456bdf7f25de8ffb27da808b560ac58497b8aae4bf24b2109f" Dec 03 20:01:56.638830 master-0 kubenswrapper[9368]: I1203 20:01:56.638757 9368 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"550fa2508090ec9228e5344d14eb3903d47f1fd24e235f6122c95a9e089d9e56"} pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 20:01:56.639064 master-0 kubenswrapper[9368]: I1203 20:01:56.638882 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" containerID="cri-o://550fa2508090ec9228e5344d14eb3903d47f1fd24e235f6122c95a9e089d9e56" gracePeriod=600 Dec 03 20:01:56.657429 master-0 kubenswrapper[9368]: I1203 20:01:56.657395 9368 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 03 20:01:56.676177 master-0 kubenswrapper[9368]: E1203 20:01:56.676126 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:01:57.055253 master-0 kubenswrapper[9368]: I1203 20:01:57.055218 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/2.log" Dec 03 20:01:57.055833 master-0 kubenswrapper[9368]: I1203 20:01:57.055811 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/1.log" Dec 03 20:01:57.058916 master-0 kubenswrapper[9368]: I1203 20:01:57.058865 9368 generic.go:334] "Generic (PLEG): container finished" podID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerID="550fa2508090ec9228e5344d14eb3903d47f1fd24e235f6122c95a9e089d9e56" exitCode=0 Dec 03 20:01:58.072724 master-0 kubenswrapper[9368]: I1203 20:01:58.072629 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q47xb_433c3273-c99e-4d68-befc-06f92d2fc8d5/cluster-baremetal-operator/1.log" Dec 03 20:01:58.075883 master-0 kubenswrapper[9368]: I1203 20:01:58.075756 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q47xb_433c3273-c99e-4d68-befc-06f92d2fc8d5/cluster-baremetal-operator/0.log" Dec 03 20:01:58.085383 master-0 kubenswrapper[9368]: I1203 20:01:58.085309 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/2.log" Dec 03 20:01:58.086749 master-0 kubenswrapper[9368]: I1203 20:01:58.086689 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/1.log" Dec 03 20:01:58.091648 master-0 kubenswrapper[9368]: I1203 20:01:58.091549 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/1.log" Dec 03 20:01:58.092622 master-0 kubenswrapper[9368]: I1203 20:01:58.092561 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/0.log" Dec 03 20:01:58.096714 master-0 kubenswrapper[9368]: I1203 20:01:58.096625 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/2.log" Dec 03 20:01:58.097590 master-0 kubenswrapper[9368]: I1203 
20:01:58.097531 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/1.log" Dec 03 20:01:58.102822 master-0 kubenswrapper[9368]: I1203 20:01:58.102745 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/2.log" Dec 03 20:01:58.103626 master-0 kubenswrapper[9368]: I1203 20:01:58.103562 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/1.log" Dec 03 20:01:58.107574 master-0 kubenswrapper[9368]: I1203 20:01:58.107527 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/1.log" Dec 03 20:01:58.109114 master-0 kubenswrapper[9368]: I1203 20:01:58.109065 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/0.log" Dec 03 20:01:58.117172 master-0 kubenswrapper[9368]: I1203 20:01:58.117110 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/1.log" Dec 03 20:01:58.117978 master-0 kubenswrapper[9368]: I1203 20:01:58.117929 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/0.log" Dec 03 20:01:58.121934 master-0 kubenswrapper[9368]: I1203 20:01:58.121867 
9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/2.log" Dec 03 20:01:58.122957 master-0 kubenswrapper[9368]: I1203 20:01:58.122916 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/1.log" Dec 03 20:01:58.127040 master-0 kubenswrapper[9368]: I1203 20:01:58.127010 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/2.log" Dec 03 20:01:58.127739 master-0 kubenswrapper[9368]: I1203 20:01:58.127711 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/1.log" Dec 03 20:01:58.131248 master-0 kubenswrapper[9368]: I1203 20:01:58.131203 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/2.log" Dec 03 20:01:58.131977 master-0 kubenswrapper[9368]: I1203 20:01:58.131946 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/1.log" Dec 03 20:01:58.135608 master-0 kubenswrapper[9368]: I1203 20:01:58.135553 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/2.log" Dec 03 20:01:58.136263 master-0 kubenswrapper[9368]: I1203 20:01:58.136202 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/1.log" Dec 03 20:01:58.137113 master-0 kubenswrapper[9368]: I1203 20:01:58.137076 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/0.log" Dec 03 20:02:01.925616 master-0 kubenswrapper[9368]: E1203 20:02:01.925506 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:02:07.202853 master-0 kubenswrapper[9368]: I1203 20:02:07.202704 9368 generic.go:334] "Generic (PLEG): container finished" podID="b84835e3-e8bc-4aa4-a8f3-f9be702a358a" containerID="193ee1ad3e7ee183f1ea38494d7735760027689afd79629a8d160747a2494f67" exitCode=0 Dec 03 20:02:09.634051 master-0 kubenswrapper[9368]: E1203 20:02:09.633981 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 20:02:09.640163 master-0 kubenswrapper[9368]: E1203 20:02:09.640114 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Dec 03 20:02:10.234536 master-0 kubenswrapper[9368]: I1203 20:02:10.234408 9368 generic.go:334] "Generic (PLEG): container finished" podID="4dd8b778e190b1975a0a8fad534da6dd" containerID="7ffe9984ab39638ad7730b79c49181e26ef0a2e2748c84910693d2353db0a811" exitCode=0 Dec 03 20:02:10.936582 master-0 
kubenswrapper[9368]: I1203 20:02:10.936533 9368 status_manager.go:851] "Failed to get status for pod" podUID="af6f6483-5ca1-48b7-90b5-b03d460d041a" pod="openshift-marketplace/redhat-operators-6zrxk" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods redhat-operators-6zrxk)" Dec 03 20:02:11.928281 master-0 kubenswrapper[9368]: E1203 20:02:11.927995 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:02:13.254403 master-0 kubenswrapper[9368]: I1203 20:02:13.254217 9368 generic.go:334] "Generic (PLEG): container finished" podID="63e3d36d-1676-4f90-ac9a-d85b861a4655" containerID="59561622c420df151d8043e444eaec7dca0c22e244b1a6ac8880f20fe809e5c4" exitCode=0 Dec 03 20:02:13.257853 master-0 kubenswrapper[9368]: I1203 20:02:13.257745 9368 generic.go:334] "Generic (PLEG): container finished" podID="8dbbb6f8-711c-49a0-bc36-fa5d50124bd8" containerID="33fc3458349b78bc19c8b30395e299c49cdfbf37f7e541929fe27fba4fc59440" exitCode=0 Dec 03 20:02:13.677214 master-0 kubenswrapper[9368]: E1203 20:02:13.677062 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:02:16.692161 master-0 kubenswrapper[9368]: E1203 20:02:16.691885 9368 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{route-controller-manager-869d689b5b-brqck.187dccd4bcedcfa7 openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-869d689b5b-brqck,UID:b5cad72f-5bbf-42fc-9d63-545a01c98cbe,APIVersion:v1,ResourceVersion:7485,FieldPath:spec.containers{route-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebd79294a663cb38370ae81f9cda91cef7fb1370ec5b495b4bdb95e77272e6a8\" in 44.973s (44.973s including waiting). Image size: 481573011 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:57:23.073937319 +0000 UTC m=+108.735187230,LastTimestamp:2025-12-03 19:57:23.073937319 +0000 UTC m=+108.735187230,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 20:02:19.305725 master-0 kubenswrapper[9368]: I1203 20:02:19.305604 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-8xmrv_0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/openshift-config-operator/0.log" Dec 03 20:02:19.306579 master-0 kubenswrapper[9368]: I1203 20:02:19.306436 9368 generic.go:334] "Generic (PLEG): container finished" podID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerID="2dd513c4c7700ec665cd85658968cfa47ab585f4855779f0285e2f319e1b23ec" exitCode=0 Dec 03 20:02:19.853519 master-0 kubenswrapper[9368]: I1203 20:02:19.853419 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:02:19.853519 master-0 kubenswrapper[9368]: I1203 20:02:19.853510 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" 
podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:02:20.664461 master-0 kubenswrapper[9368]: I1203 20:02:20.664317 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:02:20.664461 master-0 kubenswrapper[9368]: I1203 20:02:20.664402 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:02:21.929006 master-0 kubenswrapper[9368]: E1203 20:02:21.928870 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:02:21.929006 master-0 kubenswrapper[9368]: E1203 20:02:21.928931 9368 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 20:02:22.853746 master-0 kubenswrapper[9368]: I1203 20:02:22.853649 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:02:22.854241 master-0 kubenswrapper[9368]: I1203 
20:02:22.853767 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:02:23.663805 master-0 kubenswrapper[9368]: I1203 20:02:23.663704 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:02:23.664601 master-0 kubenswrapper[9368]: I1203 20:02:23.663815 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:02:24.348192 master-0 kubenswrapper[9368]: I1203 20:02:24.348062 9368 generic.go:334] "Generic (PLEG): container finished" podID="f9f99422-7991-40ef-92a1-de2e603e47b9" containerID="9936bd164d7a83dfd6c86c4312838d63181895add63b7d1de35a090b8b7d369b" exitCode=0 Dec 03 20:02:24.351366 master-0 kubenswrapper[9368]: I1203 20:02:24.351301 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-bbd9b9dff-vqzdb_7ed25861-1328-45e7-922e-37588a0b019c/cluster-node-tuning-operator/0.log" Dec 03 20:02:24.351520 master-0 kubenswrapper[9368]: I1203 20:02:24.351382 9368 generic.go:334] "Generic (PLEG): container finished" podID="7ed25861-1328-45e7-922e-37588a0b019c" containerID="b15d5b3401a95a50f5c18b6410300731cd922d460a927b29c822856e4c00523b" 
exitCode=1 Dec 03 20:02:25.853876 master-0 kubenswrapper[9368]: I1203 20:02:25.853763 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:02:25.853876 master-0 kubenswrapper[9368]: I1203 20:02:25.853866 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:02:26.664169 master-0 kubenswrapper[9368]: I1203 20:02:26.664071 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:02:26.664484 master-0 kubenswrapper[9368]: I1203 20:02:26.664186 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:02:27.378926 master-0 kubenswrapper[9368]: I1203 20:02:27.378857 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/2.log" Dec 03 20:02:27.379739 master-0 kubenswrapper[9368]: I1203 20:02:27.379363 9368 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/1.log" Dec 03 20:02:27.379959 master-0 kubenswrapper[9368]: I1203 20:02:27.379915 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/0.log" Dec 03 20:02:27.380059 master-0 kubenswrapper[9368]: I1203 20:02:27.379986 9368 generic.go:334] "Generic (PLEG): container finished" podID="367c2c7c-1fc8-4608-aa94-b64c6c70cc61" containerID="1bde03f53f1aba9b728bfefdb85dd63f6a5517b4c8a0343559f64c1f03ce4e3c" exitCode=1 Dec 03 20:02:28.391415 master-0 kubenswrapper[9368]: I1203 20:02:28.391293 9368 generic.go:334] "Generic (PLEG): container finished" podID="5decce88-c71e-411c-87b5-a37dd0f77e7b" containerID="ce3971a00b14ee7d8820c7e2ce38f070172641049e39dce3eb3a076d83a464ea" exitCode=0 Dec 03 20:02:28.852913 master-0 kubenswrapper[9368]: I1203 20:02:28.852673 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:02:28.852913 master-0 kubenswrapper[9368]: I1203 20:02:28.852755 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:02:30.661320 master-0 kubenswrapper[9368]: E1203 20:02:30.661199 9368 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested 
timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 20:02:30.661320 master-0 kubenswrapper[9368]: I1203 20:02:30.661260 9368 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Dec 03 20:02:30.679026 master-0 kubenswrapper[9368]: E1203 20:02:30.678586 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:02:31.853595 master-0 kubenswrapper[9368]: I1203 20:02:31.853461 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:02:31.853595 master-0 kubenswrapper[9368]: I1203 20:02:31.853526 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:02:32.429289 master-0 kubenswrapper[9368]: I1203 20:02:32.429183 9368 generic.go:334] "Generic (PLEG): container finished" podID="7bce50c457ac1f4721bc81a570dd238a" containerID="23c2b742ed78624af8a87bafdac0a226661dbc177a2ddfac515be738b044bdfc" exitCode=0 Dec 03 20:02:34.853590 master-0 kubenswrapper[9368]: I1203 20:02:34.853476 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get 
\"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:02:34.853590 master-0 kubenswrapper[9368]: I1203 20:02:34.853554 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:02:36.525764 master-0 kubenswrapper[9368]: I1203 20:02:36.525687 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:02:37.853104 master-0 kubenswrapper[9368]: I1203 20:02:37.853008 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:02:37.864053 master-0 kubenswrapper[9368]: I1203 20:02:37.853115 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:02:37.864053 master-0 kubenswrapper[9368]: I1203 20:02:37.861762 9368 scope.go:117] "RemoveContainer" containerID="52062cf7e28f06e4b78d834f54e665243402b015a9d5ef15880a1512af2a4c43" Dec 03 20:02:37.897702 master-0 kubenswrapper[9368]: I1203 
20:02:37.897648 9368 scope.go:117] "RemoveContainer" containerID="d3dcff6d3aa1b038077193f459470aa3ca6e3833e6b52e5e7c49c67633f191e1" Dec 03 20:02:38.609764 master-0 kubenswrapper[9368]: I1203 20:02:38.609643 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:02:40.495426 master-0 kubenswrapper[9368]: I1203 20:02:40.495299 9368 generic.go:334] "Generic (PLEG): container finished" podID="78a864f2-934f-4197-9753-24c9bc7f1fca" containerID="6f8d03455884710e737b779ab993de7b077a6712d61dd531eb926a20dcac48c1" exitCode=0 Dec 03 20:02:40.498877 master-0 kubenswrapper[9368]: I1203 20:02:40.498825 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-75b4d49d4c-pqz7q_0d4e4f88-7106-4a46-8b63-053345922fb0/package-server-manager/0.log" Dec 03 20:02:40.499516 master-0 kubenswrapper[9368]: I1203 20:02:40.499454 9368 generic.go:334] "Generic (PLEG): container finished" podID="0d4e4f88-7106-4a46-8b63-053345922fb0" containerID="2f3d798fc128d08f2b78c16a96552eb1af844c024c5ff08c6a9c3b2ad0da6b71" exitCode=1 Dec 03 20:02:41.013384 master-0 kubenswrapper[9368]: I1203 20:02:41.013314 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:02:41.013979 master-0 kubenswrapper[9368]: I1203 20:02:41.013389 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:02:41.409854 master-0 kubenswrapper[9368]: I1203 20:02:41.409323 9368 patch_prober.go:28] interesting pod/package-server-manager-75b4d49d4c-pqz7q container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" start-of-body= Dec 03 20:02:41.409854 master-0 kubenswrapper[9368]: I1203 20:02:41.409359 9368 patch_prober.go:28] interesting pod/package-server-manager-75b4d49d4c-pqz7q container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" start-of-body= Dec 03 20:02:41.409854 master-0 kubenswrapper[9368]: I1203 20:02:41.409417 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" podUID="0d4e4f88-7106-4a46-8b63-053345922fb0" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" Dec 03 20:02:41.409854 master-0 kubenswrapper[9368]: I1203 20:02:41.409461 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" podUID="0d4e4f88-7106-4a46-8b63-053345922fb0" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" Dec 03 20:02:42.322108 master-0 kubenswrapper[9368]: E1203 20:02:42.321559 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:02:32Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:02:32Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:02:32Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:02:32Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a17e9d83aeb6de5f0851aaacd1a9ebddbc8a4ac3ece2e4af8670aa0c33b8fc9c\\\"],\\\"sizeBytes\\\":1631769045},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2efbb545a141552851226bea008b13d92cbb084339bcfd6923b38d23c382145e\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:e5c3ad640f9c0c84490a0e0da7a1850b7873867936a5b604c07a8075c3a710d0\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1610175307},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98ce2d349f8bc693d76d9a68097b758b987cf17ea3beb66bbd09d12fa78b4d0c\\\"],\\\"sizeBytes\\\":1232076476},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:446e5d504e70c7963ef7b0f090f3fcb19847ef48150299bf030847565d7a579b\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a01ee07f4838bab6cfa5a3d25d867557aa271725bfcd20a1e52d3cc63423c06b\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1204969293},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:31b0e25262b7daa1c7a43042f865ca936aa1a52776994642f88b9a12408d27da\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ae694324b195581f542841a64634b63bae3d63332705b3a27320d18fde2aebe8\\\",\\\"registry.redhat.io/redhat/community-opera
tor-index:v4.18\\\"],\\\"sizeBytes\\\":1201363276},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ebe9f523f5c211a3a0f2570331dddcd5be15b12c1fecd9b8b121f881bfaad029\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a31af646ce5587c051459a88df413dc30be81e7f15399aa909e19effa7de772a\\\"],\\\"sizeBytes\\\":983731853},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\\\"],\\\"sizeBytes\\\":938321573},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3c0962dbbad51633a7d97ef253d0249269bfe3bbef3bfe99a99457470e7a682\\\"],\\\"sizeBytes\\\":912736453},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dfc0403f71f7c926db1084c7fb5fb4f19007271213ee34f6f3d3eecdbe817d6b\\\"],\\\"sizeBytes\\\":874839630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e8f313372fe49afad871cc56225dcd4d31bed249abeab55fb288e1f854138fbf\\\"],\\\"sizeBytes\\\":870581225},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc72da7f7930eb09abf6f8dbe577bb537e3a2a59dc0e14a4319b42c0212218d1\\\"],\\\"sizeBytes\\\":857083855},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f8a38d71a75c4fa803249cc709d60039d14878e218afd88a86083526ee8f78ad\\\"],\\\"sizeBytes\\\":856674149},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e5c0acdd03dc840d7345ae397feaf6147a32a8fef89a0ac2ddc8d14b068c9ff\\\"],\\\"sizeBytes\\\":767313881},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:184239929f74bb7c56c1cf5b94b5f91dd4013a87034fe04b9fa1027d2bb6c5a4\\\"],\\\"sizeBytes\\\":682385666},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0d866f93bed16cfebd8019ad6b89a4dd4abedfc20ee5d28d7edad045e7df0fda\\\"],\\\"sizeBytes\\\":677540255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b99ce0f31213291444482af4af36345dc93acdbe965868073e8232797b8a2f14\\\"],\\\"sizeBytes\\\":672854011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8\\\"],\\\"sizeBytes\\\":616123373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:da6f62afd2795d1b0af69532a5534c099bbb81d4e7abd2616b374db191552c51\\\"],\\\"sizeBytes\\\":583850203},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:51a4c20765f54b6a6b5513f97cf54bb99631c2abe860949293456886a74f87fe\\\"],\\\"sizeBytes\\\":576621883},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc9758be9f0f0a480fb5e119ecb1e1101ef807bdc765a155212a8188d79b9e60\\\"],\\\"sizeBytes\\\":552687886},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32236659da74056138c839429f304a96ba36dd304d7eefb6b2618ecfdf6308e3\\\"],\\\"sizeBytes\\\":551903461},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e8903affdf29401b9a86b9f58795c9f445f34194960c7b2734f30601c48cbdf\\\"],\\\"sizeBytes\\\":543241813},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c921698d30c8175da0c124f72748e93551d6903b0f34d26743b60cb12d25cb1\\\"],\\\"sizeBytes\\\":532668041},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ceaa4102b35e54be54e23c8ea73bb0dac4978cffb54105ad00b51393f47595da\\\"],\\\"sizeBytes\\\":532338751},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ca4933b9ba55069205ea53970128c4e8c4b46560ef721c8aaee00aaf736664b5\\\"],\\\"sizeBytes\\\":512852463},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:98c80d92a2ef8d44ee625b229b77b7bfdb1b06cbfe0d4d
f9e2ca2cba904467f7\\\"],\\\"sizeBytes\\\":512468025},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\\\"],\\\"sizeBytes\\\":509451797},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae8c6193ace2c439dd93d8129f68f3704727650851a628c906bff9290940ef03\\\"],\\\"sizeBytes\\\":508056015},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a2ef63f356c11ba629d8038474ab287797340de1219b4fee97c386975689110\\\"],\\\"sizeBytes\\\":507701628},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:84a52132860e74998981b76c08d38543561197c3da77836c670fa8e394c5ec17\\\"],\\\"sizeBytes\\\":506755373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:492103a8365ef9a1d5f237b4ba90aff87369167ec91db29ff0251ba5aab2b419\\\"],\\\"sizeBytes\\\":505663073},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2b518cb834a0b6ca50d73eceb5f8e64aefb09094d39e4ba0d8e4632f6cdf908\\\"],\\\"sizeBytes\\\":505642108},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:58ed827ee19ac91b6f860d307797b24b8aec02e671605388c4afe4fa19ddfc36\\\"],\\\"sizeBytes\\\":503354646},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eefdc67602b8bc3941001b030ab95d82e10432f814634b80eb8ce45bc9ebd3de\\\"],\\\"sizeBytes\\\":503025552},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3051af3343018fecbf3a6edacea69de841fc5211c09e7fb6a2499188dc979395\\\"],\\\"sizeBytes\\\":502450335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4cb6ecfb89e53653b69ae494ebc940b9fcf7b7db317b156e186435cc541589d9\\\"],\\\"sizeBytes\\\":500957387},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3d835ce07d1bec4a4b13f0bca5ea20ea5c781ea7853d7b42310f4ad8aeba6d7c\\\"],\\\"sizeBytes\\\":500863090},{\\\"names\\\":[\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:49a6a3308d885301c7718a465f1af2d08a617abbdff23352d5422d1ae4af33cf\\\"],\\\"sizeBytes\\\":499812475},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e254a7fb8a2643817718cfdb54bc819e86eb84232f6e2456548c55c5efb09d2\\\"],\\\"sizeBytes\\\":499719811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:44e82a51fce7b5996b183c10c44bd79b0e1ae2257fd5809345fbca1c50aaa08f\\\"],\\\"sizeBytes\\\":499138950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:93145fd0c004dc4fca21435a32c7e55e962f321aff260d702f387cfdebee92a5\\\"],\\\"sizeBytes\\\":499096673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0c6de747539dd00ede882fb4f73cead462bf0a7efda7173fd5d443ef7a00251\\\"],\\\"sizeBytes\\\":490470354},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6199be91b821875ba2609cf7fa886b74b9a8b573622fe33cc1bc39cd55acac08\\\"],\\\"sizeBytes\\\":489542560},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebd79294a663cb38370ae81f9cda91cef7fb1370ec5b495b4bdb95e77272e6a8\\\"],\\\"sizeBytes\\\":481573011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b4e0b20fdb38d516e871ff5d593c4273cc9933cb6a65ec93e727ca4a7777fd20\\\"],\\\"sizeBytes\\\":478931717},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a3e2790bda8898df5e4e9cf1878103ac483ea1633819d76ea68976b0b2062b6\\\"],\\\"sizeBytes\\\":478655954},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b294511902fd7a80e135b23895a944570932dc0fab1ee22f296523840740332e\\\"],\\\"sizeBytes\\\":465302163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23aa409d98c18a25b5dd3c14b4c5a88eba2c793d020f2deb3bafd58a2225c328\\\"],\\\"sizeBytes\\\":465158513},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:656fe650bac2929182cd0cf7d7e566d089f69e06541b8329c6d40b89346c03ca\\\
"],\\\"sizeBytes\\\":462741734}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:02:43.854427 master-0 kubenswrapper[9368]: I1203 20:02:43.854292 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:02:43.854427 master-0 kubenswrapper[9368]: I1203 20:02:43.854404 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:02:46.526152 master-0 kubenswrapper[9368]: I1203 20:02:46.526029 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:02:46.854443 master-0 kubenswrapper[9368]: I1203 20:02:46.854239 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:02:46.854443 master-0 kubenswrapper[9368]: I1203 20:02:46.854336 9368 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:02:47.680750 master-0 kubenswrapper[9368]: E1203 20:02:47.680609 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:02:48.609485 master-0 kubenswrapper[9368]: I1203 20:02:48.609341 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:02:49.853990 master-0 kubenswrapper[9368]: I1203 20:02:49.853900 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:02:49.853990 master-0 kubenswrapper[9368]: I1203 20:02:49.853982 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:02:50.695688 master-0 kubenswrapper[9368]: E1203 20:02:50.695481 9368 event.go:359] "Server rejected event (will not 
retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{cloud-credential-operator-7c4dc67499-lqdlr.187dccd4bf86a628 openshift-cloud-credential-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-cloud-credential-operator,Name:cloud-credential-operator-7c4dc67499-lqdlr,UID:6404bbc7-8ca9-4f20-8ce7-40f855555160,APIVersion:v1,ResourceVersion:7826,FieldPath:spec.containers{cloud-credential-operator},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dfc0403f71f7c926db1084c7fb5fb4f19007271213ee34f6f3d3eecdbe817d6b\" in 44.24s (44.24s including waiting). Image size: 874839630 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:57:23.117508136 +0000 UTC m=+108.778758057,LastTimestamp:2025-12-03 19:57:23.117508136 +0000 UTC m=+108.778758057,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 20:02:51.409157 master-0 kubenswrapper[9368]: I1203 20:02:51.409038 9368 patch_prober.go:28] interesting pod/package-server-manager-75b4d49d4c-pqz7q container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" start-of-body= Dec 03 20:02:51.409157 master-0 kubenswrapper[9368]: I1203 20:02:51.409130 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" podUID="0d4e4f88-7106-4a46-8b63-053345922fb0" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" Dec 03 20:02:51.410142 master-0 kubenswrapper[9368]: I1203 
20:02:51.409057 9368 patch_prober.go:28] interesting pod/package-server-manager-75b4d49d4c-pqz7q container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" start-of-body= Dec 03 20:02:51.410142 master-0 kubenswrapper[9368]: I1203 20:02:51.409404 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" podUID="0d4e4f88-7106-4a46-8b63-053345922fb0" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" Dec 03 20:02:52.323200 master-0 kubenswrapper[9368]: E1203 20:02:52.322749 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:02:52.853271 master-0 kubenswrapper[9368]: I1203 20:02:52.853192 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:02:52.853271 master-0 kubenswrapper[9368]: I1203 20:02:52.853253 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:02:55.853690 master-0 kubenswrapper[9368]: I1203 20:02:55.853587 9368 
patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:02:55.854696 master-0 kubenswrapper[9368]: I1203 20:02:55.853698 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:02:56.526917 master-0 kubenswrapper[9368]: I1203 20:02:56.526830 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:02:57.434564 master-0 kubenswrapper[9368]: E1203 20:02:57.434499 9368 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 20:02:57.434564 master-0 kubenswrapper[9368]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-ff788744d-hkt6c_openshift-controller-manager_1c22cb59-5083-4be6-9998-a9e67a2c20cd_0(4159edcb3727883de5511576a33a3cacc1250f5f8682efe48fb02df8067e2892): error adding pod openshift-controller-manager_controller-manager-ff788744d-hkt6c to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4159edcb3727883de5511576a33a3cacc1250f5f8682efe48fb02df8067e2892" Netns:"/var/run/netns/73698bdc-d82e-4367-bc9b-caacc74a2eb8" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-ff788744d-hkt6c;K8S_POD_INFRA_CONTAINER_ID=4159edcb3727883de5511576a33a3cacc1250f5f8682efe48fb02df8067e2892;K8S_POD_UID=1c22cb59-5083-4be6-9998-a9e67a2c20cd" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-ff788744d-hkt6c] networking: Multus: [openshift-controller-manager/controller-manager-ff788744d-hkt6c/1c22cb59-5083-4be6-9998-a9e67a2c20cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-ff788744d-hkt6c?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 20:02:57.434564 master-0 kubenswrapper[9368]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 20:02:57.434564 master-0 kubenswrapper[9368]: > Dec 03 20:02:57.435154 master-0 kubenswrapper[9368]: E1203 20:02:57.434597 9368 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 20:02:57.435154 master-0 kubenswrapper[9368]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-ff788744d-hkt6c_openshift-controller-manager_1c22cb59-5083-4be6-9998-a9e67a2c20cd_0(4159edcb3727883de5511576a33a3cacc1250f5f8682efe48fb02df8067e2892): error adding pod 
openshift-controller-manager_controller-manager-ff788744d-hkt6c to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4159edcb3727883de5511576a33a3cacc1250f5f8682efe48fb02df8067e2892" Netns:"/var/run/netns/73698bdc-d82e-4367-bc9b-caacc74a2eb8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-ff788744d-hkt6c;K8S_POD_INFRA_CONTAINER_ID=4159edcb3727883de5511576a33a3cacc1250f5f8682efe48fb02df8067e2892;K8S_POD_UID=1c22cb59-5083-4be6-9998-a9e67a2c20cd" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-ff788744d-hkt6c] networking: Multus: [openshift-controller-manager/controller-manager-ff788744d-hkt6c/1c22cb59-5083-4be6-9998-a9e67a2c20cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-ff788744d-hkt6c?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 20:02:57.435154 master-0 kubenswrapper[9368]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 20:02:57.435154 master-0 kubenswrapper[9368]: > pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:02:57.435154 master-0 kubenswrapper[9368]: E1203 20:02:57.434632 
9368 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 20:02:57.435154 master-0 kubenswrapper[9368]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-ff788744d-hkt6c_openshift-controller-manager_1c22cb59-5083-4be6-9998-a9e67a2c20cd_0(4159edcb3727883de5511576a33a3cacc1250f5f8682efe48fb02df8067e2892): error adding pod openshift-controller-manager_controller-manager-ff788744d-hkt6c to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4159edcb3727883de5511576a33a3cacc1250f5f8682efe48fb02df8067e2892" Netns:"/var/run/netns/73698bdc-d82e-4367-bc9b-caacc74a2eb8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-ff788744d-hkt6c;K8S_POD_INFRA_CONTAINER_ID=4159edcb3727883de5511576a33a3cacc1250f5f8682efe48fb02df8067e2892;K8S_POD_UID=1c22cb59-5083-4be6-9998-a9e67a2c20cd" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-ff788744d-hkt6c] networking: Multus: [openshift-controller-manager/controller-manager-ff788744d-hkt6c/1c22cb59-5083-4be6-9998-a9e67a2c20cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-ff788744d-hkt6c?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 20:02:57.435154 master-0 kubenswrapper[9368]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 20:02:57.435154 master-0 kubenswrapper[9368]: > pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:02:57.435154 master-0 kubenswrapper[9368]: E1203 20:02:57.434725 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-ff788744d-hkt6c_openshift-controller-manager(1c22cb59-5083-4be6-9998-a9e67a2c20cd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-ff788744d-hkt6c_openshift-controller-manager(1c22cb59-5083-4be6-9998-a9e67a2c20cd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-ff788744d-hkt6c_openshift-controller-manager_1c22cb59-5083-4be6-9998-a9e67a2c20cd_0(4159edcb3727883de5511576a33a3cacc1250f5f8682efe48fb02df8067e2892): error adding pod openshift-controller-manager_controller-manager-ff788744d-hkt6c to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"4159edcb3727883de5511576a33a3cacc1250f5f8682efe48fb02df8067e2892\\\" Netns:\\\"/var/run/netns/73698bdc-d82e-4367-bc9b-caacc74a2eb8\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-ff788744d-hkt6c;K8S_POD_INFRA_CONTAINER_ID=4159edcb3727883de5511576a33a3cacc1250f5f8682efe48fb02df8067e2892;K8S_POD_UID=1c22cb59-5083-4be6-9998-a9e67a2c20cd\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-ff788744d-hkt6c] networking: 
Multus: [openshift-controller-manager/controller-manager-ff788744d-hkt6c/1c22cb59-5083-4be6-9998-a9e67a2c20cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-ff788744d-hkt6c in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-ff788744d-hkt6c?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" podUID="1c22cb59-5083-4be6-9998-a9e67a2c20cd" Dec 03 20:02:57.631040 master-0 kubenswrapper[9368]: I1203 20:02:57.630997 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q47xb_433c3273-c99e-4d68-befc-06f92d2fc8d5/cluster-baremetal-operator/2.log" Dec 03 20:02:57.631381 master-0 kubenswrapper[9368]: I1203 20:02:57.631360 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q47xb_433c3273-c99e-4d68-befc-06f92d2fc8d5/cluster-baremetal-operator/1.log" Dec 03 20:02:57.632169 master-0 kubenswrapper[9368]: I1203 20:02:57.632148 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q47xb_433c3273-c99e-4d68-befc-06f92d2fc8d5/cluster-baremetal-operator/0.log" Dec 03 20:02:57.632213 master-0 kubenswrapper[9368]: I1203 20:02:57.632180 9368 generic.go:334] "Generic (PLEG): container finished" podID="433c3273-c99e-4d68-befc-06f92d2fc8d5" containerID="0714d8c339d81fe37d65f8b61284fb17442521338c0d1beb9a6cde0e4b83dcaa" exitCode=1 Dec 03 20:02:58.609818 master-0 kubenswrapper[9368]: I1203 20:02:58.609628 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:02:58.644466 master-0 kubenswrapper[9368]: I1203 20:02:58.644378 9368 generic.go:334] "Generic (PLEG): container finished" podID="7bce50c457ac1f4721bc81a570dd238a" containerID="ecf333f033fb5f8af44f74367011135c5c68151c236ed2fb6c9deb690a21c615" exitCode=1 Dec 03 20:02:58.853855 master-0 kubenswrapper[9368]: I1203 20:02:58.853664 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:02:58.854148 master-0 kubenswrapper[9368]: I1203 20:02:58.853882 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:01.409339 master-0 kubenswrapper[9368]: I1203 20:03:01.409244 9368 patch_prober.go:28] 
interesting pod/package-server-manager-75b4d49d4c-pqz7q container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" start-of-body= Dec 03 20:03:01.409902 master-0 kubenswrapper[9368]: I1203 20:03:01.409350 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" podUID="0d4e4f88-7106-4a46-8b63-053345922fb0" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" Dec 03 20:03:01.409902 master-0 kubenswrapper[9368]: I1203 20:03:01.409492 9368 patch_prober.go:28] interesting pod/package-server-manager-75b4d49d4c-pqz7q container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" start-of-body= Dec 03 20:03:01.409902 master-0 kubenswrapper[9368]: I1203 20:03:01.409545 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" podUID="0d4e4f88-7106-4a46-8b63-053345922fb0" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" Dec 03 20:03:01.853482 master-0 kubenswrapper[9368]: I1203 20:03:01.853269 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:01.853482 master-0 kubenswrapper[9368]: I1203 20:03:01.853410 9368 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:02.323568 master-0 kubenswrapper[9368]: E1203 20:03:02.323470 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:03:04.664639 master-0 kubenswrapper[9368]: E1203 20:03:04.664543 9368 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 20:03:04.665735 master-0 kubenswrapper[9368]: E1203 20:03:04.664718 9368 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1m8.038s" Dec 03 20:03:04.676042 master-0 kubenswrapper[9368]: I1203 20:03:04.675980 9368 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 03 20:03:04.682808 master-0 kubenswrapper[9368]: E1203 20:03:04.682696 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:03:04.854119 master-0 kubenswrapper[9368]: I1203 20:03:04.854016 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get 
\"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:04.854404 master-0 kubenswrapper[9368]: I1203 20:03:04.854113 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:07.661772 master-0 kubenswrapper[9368]: I1203 20:03:07.661692 9368 patch_prober.go:28] interesting pod/etcd-operator-7978bf889c-mqpzf container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" start-of-body= Dec 03 20:03:07.662623 master-0 kubenswrapper[9368]: I1203 20:03:07.661867 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" podUID="78a864f2-934f-4197-9753-24c9bc7f1fca" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" Dec 03 20:03:07.854877 master-0 kubenswrapper[9368]: I1203 20:03:07.854758 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:07.855149 master-0 kubenswrapper[9368]: I1203 20:03:07.854871 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get 
\"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:08.609519 master-0 kubenswrapper[9368]: I1203 20:03:08.609332 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:03:10.854366 master-0 kubenswrapper[9368]: I1203 20:03:10.854248 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:10.854366 master-0 kubenswrapper[9368]: I1203 20:03:10.854346 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:10.938971 master-0 kubenswrapper[9368]: I1203 20:03:10.938856 9368 status_manager.go:851] "Failed to get status for pod" podUID="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods openshift-apiserver-operator-667484ff5-lsltt)" Dec 03 20:03:12.325006 master-0 kubenswrapper[9368]: E1203 20:03:12.324848 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": 
net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:03:13.854054 master-0 kubenswrapper[9368]: I1203 20:03:13.853921 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:13.854054 master-0 kubenswrapper[9368]: I1203 20:03:13.854025 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:16.854045 master-0 kubenswrapper[9368]: I1203 20:03:16.853940 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:16.854045 master-0 kubenswrapper[9368]: I1203 20:03:16.854068 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:18.609420 master-0 kubenswrapper[9368]: I1203 20:03:18.609311 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" 
output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:03:19.853915 master-0 kubenswrapper[9368]: I1203 20:03:19.853578 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:19.853915 master-0 kubenswrapper[9368]: I1203 20:03:19.853664 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:21.684718 master-0 kubenswrapper[9368]: E1203 20:03:21.684579 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:03:22.326112 master-0 kubenswrapper[9368]: E1203 20:03:22.326018 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:03:22.326112 master-0 kubenswrapper[9368]: E1203 20:03:22.326095 9368 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 20:03:22.854078 master-0 kubenswrapper[9368]: I1203 20:03:22.853978 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv 
container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:22.855133 master-0 kubenswrapper[9368]: I1203 20:03:22.854104 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:24.699041 master-0 kubenswrapper[9368]: E1203 20:03:24.698755 9368 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{cluster-cloud-controller-manager-operator-76f56467d7-npd99.187dccd4bf87a7ef openshift-cloud-controller-manager-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-cloud-controller-manager-operator,Name:cluster-cloud-controller-manager-operator-76f56467d7-npd99,UID:61b16a8a-27a2-4a07-b5f9-10a5be2ec870,APIVersion:v1,ResourceVersion:8076,FieldPath:spec.containers{cluster-cloud-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32236659da74056138c839429f304a96ba36dd304d7eefb6b2618ecfdf6308e3\" in 45.496s (45.496s including waiting). 
Image size: 551903461 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:57:23.117574127 +0000 UTC m=+108.778824078,LastTimestamp:2025-12-03 19:57:23.117574127 +0000 UTC m=+108.778824078,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 20:03:25.854149 master-0 kubenswrapper[9368]: I1203 20:03:25.853992 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:25.854149 master-0 kubenswrapper[9368]: I1203 20:03:25.854098 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:27.861598 master-0 kubenswrapper[9368]: I1203 20:03:27.861527 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/3.log" Dec 03 20:03:27.862486 master-0 kubenswrapper[9368]: I1203 20:03:27.862245 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/2.log" Dec 03 20:03:27.863171 master-0 kubenswrapper[9368]: I1203 20:03:27.863125 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/1.log" Dec 03 20:03:27.864576 master-0 kubenswrapper[9368]: I1203 20:03:27.864032 9368 generic.go:334] "Generic (PLEG): container finished" podID="11e2c94f-f9e9-415b-a550-3006a4632ba4" containerID="d44dad492e3736c612049c8b048068de134aee1a61264b8715dac1a1505eb90d" exitCode=255 Dec 03 20:03:27.867160 master-0 kubenswrapper[9368]: I1203 20:03:27.867106 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/3.log" Dec 03 20:03:27.868243 master-0 kubenswrapper[9368]: I1203 20:03:27.868124 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/2.log" Dec 03 20:03:27.869205 master-0 kubenswrapper[9368]: I1203 20:03:27.869157 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/1.log" Dec 03 20:03:27.870348 master-0 kubenswrapper[9368]: I1203 20:03:27.870248 9368 generic.go:334] "Generic (PLEG): container finished" podID="943feb0d-7d31-446a-9100-dfc4ef013d12" containerID="d191b57d0995c3a104c3336c01e2a5bd2bc868dba6a6fcca53d04e312b18c0c9" exitCode=255 Dec 03 20:03:27.872917 master-0 kubenswrapper[9368]: I1203 20:03:27.872864 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/3.log" Dec 03 20:03:27.873577 master-0 kubenswrapper[9368]: I1203 20:03:27.873516 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/2.log" Dec 03 20:03:27.874159 master-0 kubenswrapper[9368]: I1203 20:03:27.874111 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/1.log" Dec 03 20:03:27.874743 master-0 kubenswrapper[9368]: I1203 20:03:27.874693 9368 generic.go:334] "Generic (PLEG): container finished" podID="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f" containerID="d4087ecceb78b95c5961d00b583ffbdd19fde6d2e05194469b5beb565e8c4e58" exitCode=255 Dec 03 20:03:27.877144 master-0 kubenswrapper[9368]: I1203 20:03:27.877092 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/3.log" Dec 03 20:03:27.877973 master-0 kubenswrapper[9368]: I1203 20:03:27.877894 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/2.log" Dec 03 20:03:27.878590 master-0 kubenswrapper[9368]: I1203 20:03:27.878528 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/1.log" Dec 03 20:03:27.879601 master-0 kubenswrapper[9368]: I1203 20:03:27.879531 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/0.log" Dec 03 
20:03:27.879765 master-0 kubenswrapper[9368]: I1203 20:03:27.879620 9368 generic.go:334] "Generic (PLEG): container finished" podID="daa8efc0-4514-4a14-80f5-ab9eca53a127" containerID="6d3d33e94c6f769c3d4f30283e26a8ebfb068648191bff388aba17779108057c" exitCode=255 Dec 03 20:03:27.882486 master-0 kubenswrapper[9368]: I1203 20:03:27.882425 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/3.log" Dec 03 20:03:27.883443 master-0 kubenswrapper[9368]: I1203 20:03:27.883385 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/2.log" Dec 03 20:03:27.885411 master-0 kubenswrapper[9368]: I1203 20:03:27.885357 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/1.log" Dec 03 20:03:27.885986 master-0 kubenswrapper[9368]: I1203 20:03:27.885928 9368 generic.go:334] "Generic (PLEG): container finished" podID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" containerID="6b7ea8626bddf0947a6929d715c64bbadf4eccc528c9e9ac527e662555f2ab85" exitCode=255 Dec 03 20:03:27.888496 master-0 kubenswrapper[9368]: I1203 20:03:27.888432 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/2.log" Dec 03 20:03:27.889259 master-0 kubenswrapper[9368]: I1203 20:03:27.889203 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/1.log" Dec 03 20:03:27.889903 master-0 kubenswrapper[9368]: 
I1203 20:03:27.889835 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/0.log" Dec 03 20:03:27.890000 master-0 kubenswrapper[9368]: I1203 20:03:27.889895 9368 generic.go:334] "Generic (PLEG): container finished" podID="f749c7f2-1fd7-4078-a92d-0ae5523998ac" containerID="b30a30d243315200a6f03be3c0553cf1e0283ee13ed3b826cd4d8aa9d7481e81" exitCode=255 Dec 03 20:03:27.891848 master-0 kubenswrapper[9368]: I1203 20:03:27.891762 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/3.log" Dec 03 20:03:27.892465 master-0 kubenswrapper[9368]: I1203 20:03:27.892406 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/2.log" Dec 03 20:03:27.893295 master-0 kubenswrapper[9368]: I1203 20:03:27.893230 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/1.log" Dec 03 20:03:27.894064 master-0 kubenswrapper[9368]: I1203 20:03:27.894008 9368 generic.go:334] "Generic (PLEG): container finished" podID="6eb4700c-6af0-468b-afc8-1e09b902d6bf" containerID="f3b5610345e0a05c927b635b9b59c02c0bd317dc652790faf73852f8095009c9" exitCode=255 Dec 03 20:03:28.609081 master-0 kubenswrapper[9368]: I1203 20:03:28.608979 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:03:28.853735 master-0 
kubenswrapper[9368]: I1203 20:03:28.853659 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:28.854096 master-0 kubenswrapper[9368]: I1203 20:03:28.853742 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:28.902897 master-0 kubenswrapper[9368]: I1203 20:03:28.902688 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/3.log" Dec 03 20:03:28.903866 master-0 kubenswrapper[9368]: I1203 20:03:28.903189 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/2.log" Dec 03 20:03:28.903866 master-0 kubenswrapper[9368]: I1203 20:03:28.903621 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/1.log" Dec 03 20:03:28.904435 master-0 kubenswrapper[9368]: I1203 20:03:28.904354 9368 generic.go:334] "Generic (PLEG): container finished" podID="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3" containerID="fd5126a03583a9e60c4f08ab94ff3e4d6dff99b77efc94559f88151386831a39" exitCode=255 Dec 03 20:03:28.906813 
master-0 kubenswrapper[9368]: I1203 20:03:28.906715 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/3.log" Dec 03 20:03:28.907474 master-0 kubenswrapper[9368]: I1203 20:03:28.907415 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/2.log" Dec 03 20:03:28.908121 master-0 kubenswrapper[9368]: I1203 20:03:28.908062 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/1.log" Dec 03 20:03:28.908624 master-0 kubenswrapper[9368]: I1203 20:03:28.908567 9368 generic.go:334] "Generic (PLEG): container finished" podID="01d51d9a-9beb-4357-9dc2-aeac210cd0c4" containerID="89033761971c21121ad0eb89f27a17b463a2b2ad814a0f77f8444c0013b9927d" exitCode=255 Dec 03 20:03:31.854274 master-0 kubenswrapper[9368]: I1203 20:03:31.854182 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:31.855189 master-0 kubenswrapper[9368]: I1203 20:03:31.854281 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:34.853729 master-0 kubenswrapper[9368]: I1203 20:03:34.853636 9368 patch_prober.go:28] interesting 
pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:34.854563 master-0 kubenswrapper[9368]: I1203 20:03:34.853725 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:37.853459 master-0 kubenswrapper[9368]: I1203 20:03:37.853334 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:37.853459 master-0 kubenswrapper[9368]: I1203 20:03:37.853431 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:37.937214 master-0 kubenswrapper[9368]: I1203 20:03:37.937112 9368 scope.go:117] "RemoveContainer" containerID="2eacfef43cc179b008d0d050644e4aa26e93edb95b342c88f74321432bd7fc00" Dec 03 20:03:37.959843 master-0 kubenswrapper[9368]: I1203 20:03:37.959751 9368 scope.go:117] "RemoveContainer" containerID="d5e5f345f4c7214304a5c25631f848938166f13ee76c5366965060641404f3cc" Dec 03 20:03:37.981231 master-0 kubenswrapper[9368]: I1203 20:03:37.981157 9368 scope.go:117] "RemoveContainer" 
containerID="915fbda281e49c1b3d5c238f4642cee7ff396ff14f35312879fbf5a135ba0426" Dec 03 20:03:38.006547 master-0 kubenswrapper[9368]: I1203 20:03:38.006488 9368 scope.go:117] "RemoveContainer" containerID="794beba2362386c338599c102e787bfbcb667a8f297d93f341ccc297bdb73087" Dec 03 20:03:38.047505 master-0 kubenswrapper[9368]: I1203 20:03:38.047444 9368 scope.go:117] "RemoveContainer" containerID="23b4f3f34e8595251e0fdeffba36a81024e5f343e733b49e23a5e472d12bfa81" Dec 03 20:03:38.098037 master-0 kubenswrapper[9368]: I1203 20:03:38.097930 9368 scope.go:117] "RemoveContainer" containerID="e73e12ce13ca81b680321fa012f494204d85d5e6386ba40c3313c0c4756967da" Dec 03 20:03:38.146580 master-0 kubenswrapper[9368]: I1203 20:03:38.146526 9368 scope.go:117] "RemoveContainer" containerID="8ee6a0b56a85c0d14ad54d2283fc55b5a9f7a55c73d41cd24b0430be03f47449" Dec 03 20:03:38.191437 master-0 kubenswrapper[9368]: I1203 20:03:38.189485 9368 scope.go:117] "RemoveContainer" containerID="67df0016b48dcce14201ac3044aca405e44a73dd4f2748c38de589d5302c6d89" Dec 03 20:03:38.215953 master-0 kubenswrapper[9368]: I1203 20:03:38.215907 9368 scope.go:117] "RemoveContainer" containerID="74b33948f209172661a41eab8dd989534e03391e2f9b3dab897af1dbb663716c" Dec 03 20:03:38.235547 master-0 kubenswrapper[9368]: I1203 20:03:38.235489 9368 scope.go:117] "RemoveContainer" containerID="86fb2ded70064a9e30cf3bd596a82e68f52a88cf948050917e5c6fb69423eb23" Dec 03 20:03:38.259677 master-0 kubenswrapper[9368]: I1203 20:03:38.259637 9368 scope.go:117] "RemoveContainer" containerID="5b669ed74eaf8bfa020c73b3caed3c1731e9f130494d0a6716eecb9c6dd302d9" Dec 03 20:03:38.293848 master-0 kubenswrapper[9368]: I1203 20:03:38.293719 9368 scope.go:117] "RemoveContainer" containerID="e25a90c6c614930a0aba8ebec6ee17a1bf73a834467d4ec954b7d5ad039662fb" Dec 03 20:03:38.317267 master-0 kubenswrapper[9368]: I1203 20:03:38.317086 9368 scope.go:117] "RemoveContainer" containerID="5368f3d8c609d03f47b3a2379952daea482ac8f810b561b93821ae543a16d61e" Dec 03 
20:03:38.609638 master-0 kubenswrapper[9368]: I1203 20:03:38.609507 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:03:38.682685 master-0 kubenswrapper[9368]: E1203 20:03:38.682599 9368 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 20:03:38.682685 master-0 kubenswrapper[9368]: I1203 20:03:38.682651 9368 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Dec 03 20:03:38.686317 master-0 kubenswrapper[9368]: E1203 20:03:38.686249 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:03:38.986300 master-0 kubenswrapper[9368]: I1203 20:03:38.986113 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/3.log" Dec 03 20:03:38.987113 master-0 kubenswrapper[9368]: I1203 20:03:38.987058 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/2.log" Dec 03 20:03:38.987833 master-0 kubenswrapper[9368]: I1203 20:03:38.987750 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/1.log" Dec 03 20:03:38.990399 master-0 kubenswrapper[9368]: I1203 20:03:38.990350 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/3.log" Dec 03 20:03:38.991140 master-0 kubenswrapper[9368]: I1203 20:03:38.991087 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/2.log" Dec 03 20:03:38.991823 master-0 kubenswrapper[9368]: I1203 20:03:38.991748 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/1.log" Dec 03 20:03:38.994410 master-0 kubenswrapper[9368]: I1203 20:03:38.994354 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/3.log" Dec 03 20:03:38.995218 master-0 kubenswrapper[9368]: I1203 20:03:38.995149 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/2.log" Dec 03 20:03:38.996048 master-0 kubenswrapper[9368]: I1203 20:03:38.995982 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/1.log" Dec 03 20:03:38.998701 master-0 kubenswrapper[9368]: I1203 20:03:38.998633 9368 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/3.log" Dec 03 20:03:38.999463 master-0 kubenswrapper[9368]: I1203 20:03:38.999406 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/2.log" Dec 03 20:03:39.000333 master-0 kubenswrapper[9368]: I1203 20:03:39.000280 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/1.log" Dec 03 20:03:39.005605 master-0 kubenswrapper[9368]: I1203 20:03:39.005541 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/3.log" Dec 03 20:03:39.006376 master-0 kubenswrapper[9368]: I1203 20:03:39.006327 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/2.log" Dec 03 20:03:39.007309 master-0 kubenswrapper[9368]: I1203 20:03:39.007256 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/1.log" Dec 03 20:03:39.010145 master-0 kubenswrapper[9368]: I1203 20:03:39.010075 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/3.log" Dec 03 20:03:39.010841 master-0 kubenswrapper[9368]: I1203 
20:03:39.010765 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/2.log" Dec 03 20:03:39.011562 master-0 kubenswrapper[9368]: I1203 20:03:39.011505 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/1.log" Dec 03 20:03:39.013438 master-0 kubenswrapper[9368]: I1203 20:03:39.013397 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/3.log" Dec 03 20:03:39.014226 master-0 kubenswrapper[9368]: I1203 20:03:39.014171 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/2.log" Dec 03 20:03:39.015586 master-0 kubenswrapper[9368]: I1203 20:03:39.015101 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/1.log" Dec 03 20:03:39.017455 master-0 kubenswrapper[9368]: I1203 20:03:39.017404 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-j2wgx_5b3ee9a2-0f17-4a04-9191-b60684ef6c29/kube-scheduler-operator-container/2.log" Dec 03 20:03:39.018067 master-0 kubenswrapper[9368]: I1203 20:03:39.018020 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-j2wgx_5b3ee9a2-0f17-4a04-9191-b60684ef6c29/kube-scheduler-operator-container/1.log" Dec 03 20:03:39.020367 master-0 kubenswrapper[9368]: I1203 20:03:39.020311 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/3.log" Dec 03 20:03:39.020987 master-0 kubenswrapper[9368]: I1203 20:03:39.020941 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/2.log" Dec 03 20:03:39.021690 master-0 kubenswrapper[9368]: I1203 20:03:39.021632 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/1.log" Dec 03 20:03:40.853544 master-0 kubenswrapper[9368]: I1203 20:03:40.853492 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:40.854354 master-0 kubenswrapper[9368]: I1203 20:03:40.854166 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:41.035017 master-0 kubenswrapper[9368]: I1203 20:03:41.034943 9368 generic.go:334] "Generic (PLEG): container finished" 
podID="b8709c6c-8729-4702-a3fb-35a072855096" containerID="f74560024271b473d288e14ac60c9ecd05f2a6752be21eac89b4a74e35f9a5d8" exitCode=0 Dec 03 20:03:42.490121 master-0 kubenswrapper[9368]: E1203 20:03:42.489900 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:03:32Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:03:32Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:03:32Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:03:32Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a17e9d83aeb6de5f0851aaacd1a9ebddbc8a4ac3ece2e4af8670aa0c33b8fc9c\\\"],\\\"sizeBytes\\\":1631769045},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2efbb545a141552851226bea008b13d92cbb084339bcfd6923b38d23c382145e\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:e5c3ad640f9c0c84490a0e0da7a1850b7873867936a5b604c07a8075c3a710d0\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1610175307},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98ce2d349f8bc693d76d9a68097b758b987cf17ea3beb66bbd09d12fa78b4d0c\\\"],\\\"sizeBytes\\\":1232076476},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:446e5d504e70c7963ef7b0f090f3fcb19847ef48150299bf030847565d7a579b\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a01ee07f4838bab6cfa5a3d25d867557aa271725bfcd20a1e52d3cc63423c06b\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1204969293},{\\\"names\
\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:31b0e25262b7daa1c7a43042f865ca936aa1a52776994642f88b9a12408d27da\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ae694324b195581f542841a64634b63bae3d63332705b3a27320d18fde2aebe8\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201363276},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ebe9f523f5c211a3a0f2570331dddcd5be15b12c1fecd9b8b121f881bfaad029\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a31af646ce5587c051459a88df413dc30be81e7f15399aa909e19effa7de772a\\\"],\\\"sizeBytes\\\":983731853},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\\\"],\\\"sizeBytes\\\":938321573},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3c0962dbbad51633a7d97ef253d0249269bfe3bbef3bfe99a99457470e7a682\\\"],\\\"sizeBytes\\\":912736453},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dfc0403f71f7c926db1084c7fb5fb4f19007271213ee34f6f3d3eecdbe817d6b\\\"],\\\"sizeBytes\\\":874839630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e8f313372fe49afad871cc56225dcd4d31bed249abeab55fb288e1f854138fbf\\\"],\\\"sizeBytes\\\":870581225},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc72da7f7930eb09abf6f8dbe577bb537e3a2a59dc0e14a4319b42c0212218d1\\\"],\\\"sizeBytes\\\":857083855},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f8a38d71a75c4fa803249cc709d60039d14878e218afd88a86083526ee8f78ad\\\"],\\\"sizeBytes\\\":856674149},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:0e5c0acdd03dc840d7345ae397feaf6147a32a8fef89a0ac2ddc8d14b068c9ff\\\"],\\\"sizeBytes\\\":767313881},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:184239929f74bb7c56c1cf5b94b5f91dd4013a87034fe04b9fa1027d2bb6c5a4\\\"],\\\"sizeBytes\\\":682385666},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0d866f93bed16cfebd8019ad6b89a4dd4abedfc20ee5d28d7edad045e7df0fda\\\"],\\\"sizeBytes\\\":677540255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b99ce0f31213291444482af4af36345dc93acdbe965868073e8232797b8a2f14\\\"],\\\"sizeBytes\\\":672854011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8\\\"],\\\"sizeBytes\\\":616123373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:da6f62afd2795d1b0af69532a5534c099bbb81d4e7abd2616b374db191552c51\\\"],\\\"sizeBytes\\\":583850203},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:51a4c20765f54b6a6b5513f97cf54bb99631c2abe860949293456886a74f87fe\\\"],\\\"sizeBytes\\\":576621883},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc9758be9f0f0a480fb5e119ecb1e1101ef807bdc765a155212a8188d79b9e60\\\"],\\\"sizeBytes\\\":552687886},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32236659da74056138c839429f304a96ba36dd304d7eefb6b2618ecfdf6308e3\\\"],\\\"sizeBytes\\\":551903461},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e8903affdf29401b9a86b9f58795c9f445f34194960c7b2734f30601c48cbdf\\\"],\\\"sizeBytes\\\":543241813},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c921698d30c8175da0c124f72748e93551d6903b0f34d26743b60cb12d25cb1\\\"],\\\"sizeBytes\\\":532668041},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ceaa4102b35e54be54e23c8ea73bb0dac4978cffb54105ad00b51393f47595da\\\"],\\\"s
izeBytes\\\":532338751},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ca4933b9ba55069205ea53970128c4e8c4b46560ef721c8aaee00aaf736664b5\\\"],\\\"sizeBytes\\\":512852463},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:98c80d92a2ef8d44ee625b229b77b7bfdb1b06cbfe0d4df9e2ca2cba904467f7\\\"],\\\"sizeBytes\\\":512468025},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\\\"],\\\"sizeBytes\\\":509451797},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae8c6193ace2c439dd93d8129f68f3704727650851a628c906bff9290940ef03\\\"],\\\"sizeBytes\\\":508056015},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a2ef63f356c11ba629d8038474ab287797340de1219b4fee97c386975689110\\\"],\\\"sizeBytes\\\":507701628},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:84a52132860e74998981b76c08d38543561197c3da77836c670fa8e394c5ec17\\\"],\\\"sizeBytes\\\":506755373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:492103a8365ef9a1d5f237b4ba90aff87369167ec91db29ff0251ba5aab2b419\\\"],\\\"sizeBytes\\\":505663073},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2b518cb834a0b6ca50d73eceb5f8e64aefb09094d39e4ba0d8e4632f6cdf908\\\"],\\\"sizeBytes\\\":505642108},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:58ed827ee19ac91b6f860d307797b24b8aec02e671605388c4afe4fa19ddfc36\\\"],\\\"sizeBytes\\\":503354646},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eefdc67602b8bc3941001b030ab95d82e10432f814634b80eb8ce45bc9ebd3de\\\"],\\\"sizeBytes\\\":503025552},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3051af3343018fecbf3a6edacea69de841fc5211c09e7fb6a2499188dc979395\\\"],\\\"sizeBytes\\\":502450335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4cb6
ecfb89e53653b69ae494ebc940b9fcf7b7db317b156e186435cc541589d9\\\"],\\\"sizeBytes\\\":500957387},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3d835ce07d1bec4a4b13f0bca5ea20ea5c781ea7853d7b42310f4ad8aeba6d7c\\\"],\\\"sizeBytes\\\":500863090},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49a6a3308d885301c7718a465f1af2d08a617abbdff23352d5422d1ae4af33cf\\\"],\\\"sizeBytes\\\":499812475},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e254a7fb8a2643817718cfdb54bc819e86eb84232f6e2456548c55c5efb09d2\\\"],\\\"sizeBytes\\\":499719811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:44e82a51fce7b5996b183c10c44bd79b0e1ae2257fd5809345fbca1c50aaa08f\\\"],\\\"sizeBytes\\\":499138950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:93145fd0c004dc4fca21435a32c7e55e962f321aff260d702f387cfdebee92a5\\\"],\\\"sizeBytes\\\":499096673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0c6de747539dd00ede882fb4f73cead462bf0a7efda7173fd5d443ef7a00251\\\"],\\\"sizeBytes\\\":490470354},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6199be91b821875ba2609cf7fa886b74b9a8b573622fe33cc1bc39cd55acac08\\\"],\\\"sizeBytes\\\":489542560},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebd79294a663cb38370ae81f9cda91cef7fb1370ec5b495b4bdb95e77272e6a8\\\"],\\\"sizeBytes\\\":481573011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b4e0b20fdb38d516e871ff5d593c4273cc9933cb6a65ec93e727ca4a7777fd20\\\"],\\\"sizeBytes\\\":478931717},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a3e2790bda8898df5e4e9cf1878103ac483ea1633819d76ea68976b0b2062b6\\\"],\\\"sizeBytes\\\":478655954},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b294511902fd7a80e135b23895a944570932dc0fab1ee22f296523840740332e\\\"],\\\"sizeBytes\\\":465302163},{\
\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23aa409d98c18a25b5dd3c14b4c5a88eba2c793d020f2deb3bafd58a2225c328\\\"],\\\"sizeBytes\\\":465158513},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:656fe650bac2929182cd0cf7d7e566d089f69e06541b8329c6d40b89346c03ca\\\"],\\\"sizeBytes\\\":462741734}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:03:43.854580 master-0 kubenswrapper[9368]: I1203 20:03:43.854479 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:43.856053 master-0 kubenswrapper[9368]: I1203 20:03:43.855984 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:46.853660 master-0 kubenswrapper[9368]: I1203 20:03:46.853591 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:46.854773 master-0 kubenswrapper[9368]: I1203 20:03:46.854010 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:48.610294 master-0 kubenswrapper[9368]: I1203 20:03:48.610139 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:03:49.853122 master-0 kubenswrapper[9368]: I1203 20:03:49.853014 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:49.853122 master-0 kubenswrapper[9368]: I1203 20:03:49.853101 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:52.492260 master-0 kubenswrapper[9368]: E1203 20:03:52.492164 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:03:52.853695 master-0 kubenswrapper[9368]: I1203 20:03:52.853461 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get 
\"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:52.853695 master-0 kubenswrapper[9368]: I1203 20:03:52.853589 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:55.687323 master-0 kubenswrapper[9368]: E1203 20:03:55.686839 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:03:55.854179 master-0 kubenswrapper[9368]: I1203 20:03:55.854053 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:55.854179 master-0 kubenswrapper[9368]: I1203 20:03:55.854140 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:03:58.165357 master-0 kubenswrapper[9368]: I1203 20:03:58.165293 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/2.log" Dec 03 20:03:58.167171 
master-0 kubenswrapper[9368]: I1203 20:03:58.167100 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/1.log" Dec 03 20:03:58.168888 master-0 kubenswrapper[9368]: I1203 20:03:58.168822 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/0.log" Dec 03 20:03:58.169008 master-0 kubenswrapper[9368]: I1203 20:03:58.168911 9368 generic.go:334] "Generic (PLEG): container finished" podID="3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf" containerID="befef1f27ec31a7dce800c4fe3b217c928cd2c29d212afb9d75ef9e969b32b96" exitCode=1 Dec 03 20:03:58.609427 master-0 kubenswrapper[9368]: I1203 20:03:58.609323 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:03:58.703823 master-0 kubenswrapper[9368]: E1203 20:03:58.703498 9368 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{insights-operator-59d99f9b7b-h64kt.187dccd4bf88b245 openshift-insights 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-insights,Name:insights-operator-59d99f9b7b-h64kt,UID:af2023e1-9c7a-40af-a6bf-fba31c3565b1,APIVersion:v1,ResourceVersion:7963,FieldPath:spec.containers{insights-operator},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:44e82a51fce7b5996b183c10c44bd79b0e1ae2257fd5809345fbca1c50aaa08f\" in 44.419s (44.419s including waiting). 
Image size: 499138950 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:57:23.117642309 +0000 UTC m=+108.778892220,LastTimestamp:2025-12-03 19:57:23.117642309 +0000 UTC m=+108.778892220,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 20:03:58.853913 master-0 kubenswrapper[9368]: I1203 20:03:58.853748 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:03:58.853913 master-0 kubenswrapper[9368]: I1203 20:03:58.853886 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:04:01.853169 master-0 kubenswrapper[9368]: I1203 20:04:01.853043 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:01.853169 master-0 kubenswrapper[9368]: I1203 20:04:01.853138 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 
20:04:02.492720 master-0 kubenswrapper[9368]: E1203 20:04:02.492663 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:04:04.854173 master-0 kubenswrapper[9368]: I1203 20:04:04.854051 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:04.854981 master-0 kubenswrapper[9368]: I1203 20:04:04.854168 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:04:07.662306 master-0 kubenswrapper[9368]: I1203 20:04:07.661829 9368 patch_prober.go:28] interesting pod/etcd-operator-7978bf889c-mqpzf container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" start-of-body= Dec 03 20:04:07.662306 master-0 kubenswrapper[9368]: I1203 20:04:07.661919 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" podUID="78a864f2-934f-4197-9753-24c9bc7f1fca" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" Dec 03 20:04:07.853585 master-0 kubenswrapper[9368]: I1203 20:04:07.853476 9368 patch_prober.go:28] 
interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:07.853585 master-0 kubenswrapper[9368]: I1203 20:04:07.853566 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:04:08.610069 master-0 kubenswrapper[9368]: I1203 20:04:08.609994 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:04:10.853334 master-0 kubenswrapper[9368]: I1203 20:04:10.853243 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:10.854349 master-0 kubenswrapper[9368]: I1203 20:04:10.853330 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:04:10.940997 master-0 kubenswrapper[9368]: I1203 20:04:10.940881 9368 
status_manager.go:851] "Failed to get status for pod" podUID="01d51d9a-9beb-4357-9dc2-aeac210cd0c4" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods service-ca-operator-56f5898f45-v6rp5)" Dec 03 20:04:12.493403 master-0 kubenswrapper[9368]: E1203 20:04:12.493305 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:04:12.687158 master-0 kubenswrapper[9368]: E1203 20:04:12.687014 9368 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 20:04:12.687486 master-0 kubenswrapper[9368]: E1203 20:04:12.687268 9368 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1m8.023s" Dec 03 20:04:12.687486 master-0 kubenswrapper[9368]: I1203 20:04:12.687305 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"c46583dca69d50bb12bc004d7ee3300f","Type":"ContainerDied","Data":"384902c9d5118b992b516df4665219d1bebf7324327cde78b939566df8720f4b"} Dec 03 20:04:12.688472 master-0 kubenswrapper[9368]: E1203 20:04:12.688370 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:04:12.699437 master-0 kubenswrapper[9368]: I1203 20:04:12.699356 9368 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 03 20:04:13.853501 master-0 kubenswrapper[9368]: I1203 20:04:13.853394 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:13.853501 master-0 kubenswrapper[9368]: I1203 20:04:13.853476 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:04:16.853988 master-0 kubenswrapper[9368]: I1203 20:04:16.853878 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:16.855054 master-0 kubenswrapper[9368]: I1203 20:04:16.853984 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:04:18.609358 master-0 kubenswrapper[9368]: I1203 20:04:18.609242 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get 
\"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:04:19.853475 master-0 kubenswrapper[9368]: I1203 20:04:19.853315 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:19.853475 master-0 kubenswrapper[9368]: I1203 20:04:19.853415 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:04:22.494424 master-0 kubenswrapper[9368]: E1203 20:04:22.494021 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:04:22.494424 master-0 kubenswrapper[9368]: E1203 20:04:22.494102 9368 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 20:04:22.854195 master-0 kubenswrapper[9368]: I1203 20:04:22.854055 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:22.854195 master-0 kubenswrapper[9368]: I1203 20:04:22.854114 9368 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:04:25.693174 master-0 kubenswrapper[9368]: E1203 20:04:25.693055 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 20:04:25.853136 master-0 kubenswrapper[9368]: I1203 20:04:25.853035 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:25.853136 master-0 kubenswrapper[9368]: I1203 20:04:25.853117 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:04:28.609401 master-0 kubenswrapper[9368]: I1203 20:04:28.609296 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:04:28.854124 master-0 kubenswrapper[9368]: I1203 20:04:28.853996 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator 
namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:28.854467 master-0 kubenswrapper[9368]: I1203 20:04:28.854121 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:04:29.690142 master-0 kubenswrapper[9368]: E1203 20:04:29.690078 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:04:31.853822 master-0 kubenswrapper[9368]: I1203 20:04:31.853721 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:31.854461 master-0 kubenswrapper[9368]: I1203 20:04:31.853841 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:04:32.706538 master-0 kubenswrapper[9368]: E1203 20:04:32.706352 9368 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context 
deadline exceeded" event="&Event{ObjectMeta:{cluster-storage-operator-f84784664-wnl8p.187dccd4bf8bcbc4 openshift-cluster-storage-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-cluster-storage-operator,Name:cluster-storage-operator-f84784664-wnl8p,UID:f749c7f2-1fd7-4078-a92d-0ae5523998ac,APIVersion:v1,ResourceVersion:7945,FieldPath:spec.containers{cluster-storage-operator},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae8c6193ace2c439dd93d8129f68f3704727650851a628c906bff9290940ef03\" in 44.617s (44.617s including waiting). Image size: 508056015 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:57:23.117845444 +0000 UTC m=+108.779095345,LastTimestamp:2025-12-03 19:57:23.117845444 +0000 UTC m=+108.779095345,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 20:04:34.853529 master-0 kubenswrapper[9368]: I1203 20:04:34.853439 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:34.854264 master-0 kubenswrapper[9368]: I1203 20:04:34.853521 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:04:37.853834 master-0 kubenswrapper[9368]: I1203 20:04:37.853713 9368 patch_prober.go:28] interesting 
pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:37.854650 master-0 kubenswrapper[9368]: I1203 20:04:37.853923 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:04:38.609618 master-0 kubenswrapper[9368]: I1203 20:04:38.609498 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:04:40.853757 master-0 kubenswrapper[9368]: I1203 20:04:40.853631 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:40.853757 master-0 kubenswrapper[9368]: I1203 20:04:40.853723 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:04:42.846753 master-0 kubenswrapper[9368]: E1203 20:04:42.846457 9368 
kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:04:32Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:04:32Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:04:32Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:04:32Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a17e9d83aeb6de5f0851aaacd1a9ebddbc8a4ac3ece2e4af8670aa0c33b8fc9c\\\"],\\\"sizeBytes\\\":1631769045},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2efbb545a141552851226bea008b13d92cbb084339bcfd6923b38d23c382145e\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:e5c3ad640f9c0c84490a0e0da7a1850b7873867936a5b604c07a8075c3a710d0\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1610175307},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98ce2d349f8bc693d76d9a68097b758b987cf17ea3beb66bbd09d12fa78b4d0c\\\"],\\\"sizeBytes\\\":1232076476},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:446e5d504e70c7963ef7b0f090f3fcb19847ef48150299bf030847565d7a579b\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a01ee07f4838bab6cfa5a3d25d867557aa271725bfcd20a1e52d3cc63423c06b\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1204969293},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:31b0e25262b7daa1c7a43042f865ca936aa1a52776994642f88b9a12408d27da\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ae694324b195581f5
42841a64634b63bae3d63332705b3a27320d18fde2aebe8\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201363276},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ebe9f523f5c211a3a0f2570331dddcd5be15b12c1fecd9b8b121f881bfaad029\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a31af646ce5587c051459a88df413dc30be81e7f15399aa909e19effa7de772a\\\"],\\\"sizeBytes\\\":983731853},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\\\"],\\\"sizeBytes\\\":938321573},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3c0962dbbad51633a7d97ef253d0249269bfe3bbef3bfe99a99457470e7a682\\\"],\\\"sizeBytes\\\":912736453},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dfc0403f71f7c926db1084c7fb5fb4f19007271213ee34f6f3d3eecdbe817d6b\\\"],\\\"sizeBytes\\\":874839630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e8f313372fe49afad871cc56225dcd4d31bed249abeab55fb288e1f854138fbf\\\"],\\\"sizeBytes\\\":870581225},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc72da7f7930eb09abf6f8dbe577bb537e3a2a59dc0e14a4319b42c0212218d1\\\"],\\\"sizeBytes\\\":857083855},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f8a38d71a75c4fa803249cc709d60039d14878e218afd88a86083526ee8f78ad\\\"],\\\"sizeBytes\\\":856674149},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e5c0acdd03dc840d7345ae397feaf6147a32a8fef89a0ac2ddc8d14b068c9ff\\\"],\\\"sizeBytes\\\":767313881},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:184239929f74bb7c56c1
cf5b94b5f91dd4013a87034fe04b9fa1027d2bb6c5a4\\\"],\\\"sizeBytes\\\":682385666},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0d866f93bed16cfebd8019ad6b89a4dd4abedfc20ee5d28d7edad045e7df0fda\\\"],\\\"sizeBytes\\\":677540255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b99ce0f31213291444482af4af36345dc93acdbe965868073e8232797b8a2f14\\\"],\\\"sizeBytes\\\":672854011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8\\\"],\\\"sizeBytes\\\":616123373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:da6f62afd2795d1b0af69532a5534c099bbb81d4e7abd2616b374db191552c51\\\"],\\\"sizeBytes\\\":583850203},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:51a4c20765f54b6a6b5513f97cf54bb99631c2abe860949293456886a74f87fe\\\"],\\\"sizeBytes\\\":576621883},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc9758be9f0f0a480fb5e119ecb1e1101ef807bdc765a155212a8188d79b9e60\\\"],\\\"sizeBytes\\\":552687886},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32236659da74056138c839429f304a96ba36dd304d7eefb6b2618ecfdf6308e3\\\"],\\\"sizeBytes\\\":551903461},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e8903affdf29401b9a86b9f58795c9f445f34194960c7b2734f30601c48cbdf\\\"],\\\"sizeBytes\\\":543241813},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c921698d30c8175da0c124f72748e93551d6903b0f34d26743b60cb12d25cb1\\\"],\\\"sizeBytes\\\":532668041},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ceaa4102b35e54be54e23c8ea73bb0dac4978cffb54105ad00b51393f47595da\\\"],\\\"sizeBytes\\\":532338751},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ca4933b9ba55069205ea53970128c4e8c4b46560ef721c8aaee00aaf736664b5\\\"],\\\"sizeBytes\\\":512852463},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-release@sha256:98c80d92a2ef8d44ee625b229b77b7bfdb1b06cbfe0d4df9e2ca2cba904467f7\\\"],\\\"sizeBytes\\\":512468025},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\\\"],\\\"sizeBytes\\\":509451797},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae8c6193ace2c439dd93d8129f68f3704727650851a628c906bff9290940ef03\\\"],\\\"sizeBytes\\\":508056015},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a2ef63f356c11ba629d8038474ab287797340de1219b4fee97c386975689110\\\"],\\\"sizeBytes\\\":507701628},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:84a52132860e74998981b76c08d38543561197c3da77836c670fa8e394c5ec17\\\"],\\\"sizeBytes\\\":506755373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:492103a8365ef9a1d5f237b4ba90aff87369167ec91db29ff0251ba5aab2b419\\\"],\\\"sizeBytes\\\":505663073},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2b518cb834a0b6ca50d73eceb5f8e64aefb09094d39e4ba0d8e4632f6cdf908\\\"],\\\"sizeBytes\\\":505642108},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:58ed827ee19ac91b6f860d307797b24b8aec02e671605388c4afe4fa19ddfc36\\\"],\\\"sizeBytes\\\":503354646},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eefdc67602b8bc3941001b030ab95d82e10432f814634b80eb8ce45bc9ebd3de\\\"],\\\"sizeBytes\\\":503025552},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3051af3343018fecbf3a6edacea69de841fc5211c09e7fb6a2499188dc979395\\\"],\\\"sizeBytes\\\":502450335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4cb6ecfb89e53653b69ae494ebc940b9fcf7b7db317b156e186435cc541589d9\\\"],\\\"sizeBytes\\\":500957387},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3d835ce07d1bec4a4b13f0bca5ea20ea5c781ea7853d7b
42310f4ad8aeba6d7c\\\"],\\\"sizeBytes\\\":500863090},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49a6a3308d885301c7718a465f1af2d08a617abbdff23352d5422d1ae4af33cf\\\"],\\\"sizeBytes\\\":499812475},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e254a7fb8a2643817718cfdb54bc819e86eb84232f6e2456548c55c5efb09d2\\\"],\\\"sizeBytes\\\":499719811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:44e82a51fce7b5996b183c10c44bd79b0e1ae2257fd5809345fbca1c50aaa08f\\\"],\\\"sizeBytes\\\":499138950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:93145fd0c004dc4fca21435a32c7e55e962f321aff260d702f387cfdebee92a5\\\"],\\\"sizeBytes\\\":499096673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0c6de747539dd00ede882fb4f73cead462bf0a7efda7173fd5d443ef7a00251\\\"],\\\"sizeBytes\\\":490470354},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6199be91b821875ba2609cf7fa886b74b9a8b573622fe33cc1bc39cd55acac08\\\"],\\\"sizeBytes\\\":489542560},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebd79294a663cb38370ae81f9cda91cef7fb1370ec5b495b4bdb95e77272e6a8\\\"],\\\"sizeBytes\\\":481573011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b4e0b20fdb38d516e871ff5d593c4273cc9933cb6a65ec93e727ca4a7777fd20\\\"],\\\"sizeBytes\\\":478931717},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a3e2790bda8898df5e4e9cf1878103ac483ea1633819d76ea68976b0b2062b6\\\"],\\\"sizeBytes\\\":478655954},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b294511902fd7a80e135b23895a944570932dc0fab1ee22f296523840740332e\\\"],\\\"sizeBytes\\\":465302163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23aa409d98c18a25b5dd3c14b4c5a88eba2c793d020f2deb3bafd58a2225c328\\\"],\\\"sizeBytes\\\":465158513},{\\\"names\\\":[\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:656fe650bac2929182cd0cf7d7e566d089f69e06541b8329c6d40b89346c03ca\\\"],\\\"sizeBytes\\\":462741734}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:04:43.854218 master-0 kubenswrapper[9368]: I1203 20:04:43.854148 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:43.855128 master-0 kubenswrapper[9368]: I1203 20:04:43.854999 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:04:46.692572 master-0 kubenswrapper[9368]: E1203 20:04:46.692436 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:04:46.702662 master-0 kubenswrapper[9368]: E1203 20:04:46.702576 9368 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 20:04:46.702662 master-0 kubenswrapper[9368]: I1203 20:04:46.702640 9368 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Dec 03 20:04:46.853772 master-0 
kubenswrapper[9368]: I1203 20:04:46.853654 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:46.853772 master-0 kubenswrapper[9368]: I1203 20:04:46.853760 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:04:48.610077 master-0 kubenswrapper[9368]: I1203 20:04:48.609971 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:04:49.853902 master-0 kubenswrapper[9368]: I1203 20:04:49.853762 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:49.853902 master-0 kubenswrapper[9368]: I1203 20:04:49.853891 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 
20:04:52.847630 master-0 kubenswrapper[9368]: E1203 20:04:52.847520 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:04:52.853625 master-0 kubenswrapper[9368]: I1203 20:04:52.853551 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:52.853750 master-0 kubenswrapper[9368]: I1203 20:04:52.853645 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:04:55.853420 master-0 kubenswrapper[9368]: I1203 20:04:55.853299 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:55.853420 master-0 kubenswrapper[9368]: I1203 20:04:55.853391 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:04:58.609503 master-0 
kubenswrapper[9368]: I1203 20:04:58.609349 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:04:58.853243 master-0 kubenswrapper[9368]: I1203 20:04:58.853120 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:04:58.853546 master-0 kubenswrapper[9368]: I1203 20:04:58.853232 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:05:01.853931 master-0 kubenswrapper[9368]: I1203 20:05:01.853844 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:05:01.854881 master-0 kubenswrapper[9368]: I1203 20:05:01.853932 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 
20:05:02.848377 master-0 kubenswrapper[9368]: E1203 20:05:02.848239 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:05:03.694873 master-0 kubenswrapper[9368]: E1203 20:05:03.694826 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:05:04.853708 master-0 kubenswrapper[9368]: I1203 20:05:04.853620 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:05:04.853708 master-0 kubenswrapper[9368]: I1203 20:05:04.853701 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:05:06.709500 master-0 kubenswrapper[9368]: E1203 20:05:06.709309 9368 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{redhat-marketplace-mc8kx.187dccd4bfeae13f openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-mc8kx,UID:81839b26-cf66-4532-a646-ef4cd5d5e471,APIVersion:v1,ResourceVersion:7041,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3c0962dbbad51633a7d97ef253d0249269bfe3bbef3bfe99a99457470e7a682\" in 44.805s (44.805s including waiting). Image size: 912736453 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:57:23.124076863 +0000 UTC m=+108.785326784,LastTimestamp:2025-12-03 19:57:23.124076863 +0000 UTC m=+108.785326784,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 20:05:07.661519 master-0 kubenswrapper[9368]: I1203 20:05:07.661433 9368 patch_prober.go:28] interesting pod/etcd-operator-7978bf889c-mqpzf container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" start-of-body= Dec 03 20:05:07.661834 master-0 kubenswrapper[9368]: I1203 20:05:07.661531 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" podUID="78a864f2-934f-4197-9753-24c9bc7f1fca" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" Dec 03 20:05:07.853495 master-0 kubenswrapper[9368]: I1203 20:05:07.853358 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:05:07.854351 master-0 
kubenswrapper[9368]: I1203 20:05:07.853486 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:05:08.609710 master-0 kubenswrapper[9368]: I1203 20:05:08.609551 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:05:10.853653 master-0 kubenswrapper[9368]: I1203 20:05:10.853558 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:05:10.853653 master-0 kubenswrapper[9368]: I1203 20:05:10.853644 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:05:10.943095 master-0 kubenswrapper[9368]: I1203 20:05:10.942964 9368 status_manager.go:851] "Failed to get status for pod" podUID="b5cad72f-5bbf-42fc-9d63-545a01c98cbe" pod="openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods 
route-controller-manager-869d689b5b-brqck)" Dec 03 20:05:12.849640 master-0 kubenswrapper[9368]: E1203 20:05:12.849562 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:05:13.853635 master-0 kubenswrapper[9368]: I1203 20:05:13.853528 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:05:13.853635 master-0 kubenswrapper[9368]: I1203 20:05:13.853623 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:05:16.853228 master-0 kubenswrapper[9368]: I1203 20:05:16.853150 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:05:16.853228 master-0 kubenswrapper[9368]: I1203 20:05:16.853226 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection 
refused" Dec 03 20:05:18.609571 master-0 kubenswrapper[9368]: I1203 20:05:18.609443 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 03 20:05:19.854117 master-0 kubenswrapper[9368]: I1203 20:05:19.853999 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Dec 03 20:05:19.854117 master-0 kubenswrapper[9368]: I1203 20:05:19.854100 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Dec 03 20:05:20.696634 master-0 kubenswrapper[9368]: E1203 20:05:20.696508 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:05:20.705705 master-0 kubenswrapper[9368]: E1203 20:05:20.705642 9368 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 20:05:20.705967 master-0 kubenswrapper[9368]: E1203 20:05:20.705923 9368 kubelet.go:2526] "Housekeeping took longer than 
expected" err="housekeeping took too long" expected="1s" actual="1m8.019s" Dec 03 20:05:20.706054 master-0 kubenswrapper[9368]: I1203 20:05:20.705964 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" event={"ID":"61b16a8a-27a2-4a07-b5f9-10a5be2ec870","Type":"ContainerDied","Data":"915fbda281e49c1b3d5c238f4642cee7ff396ff14f35312879fbf5a135ba0426"} Dec 03 20:05:20.706054 master-0 kubenswrapper[9368]: I1203 20:05:20.706001 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:05:20.706885 master-0 kubenswrapper[9368]: I1203 20:05:20.706829 9368 scope.go:117] "RemoveContainer" containerID="ce3971a00b14ee7d8820c7e2ce38f070172641049e39dce3eb3a076d83a464ea" Dec 03 20:05:20.707261 master-0 kubenswrapper[9368]: I1203 20:05:20.707195 9368 scope.go:117] "RemoveContainer" containerID="6f8d03455884710e737b779ab993de7b077a6712d61dd531eb926a20dcac48c1" Dec 03 20:05:20.707695 master-0 kubenswrapper[9368]: I1203 20:05:20.707578 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" event={"ID":"61b16a8a-27a2-4a07-b5f9-10a5be2ec870","Type":"ContainerStarted","Data":"2eacfef43cc179b008d0d050644e4aa26e93edb95b342c88f74321432bd7fc00"} Dec 03 20:05:20.707875 master-0 kubenswrapper[9368]: I1203 20:05:20.707732 9368 scope.go:117] "RemoveContainer" containerID="0714d8c339d81fe37d65f8b61284fb17442521338c0d1beb9a6cde0e4b83dcaa" Dec 03 20:05:20.709283 master-0 kubenswrapper[9368]: I1203 20:05:20.709037 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:05:20.709434 master-0 kubenswrapper[9368]: I1203 20:05:20.709349 9368 scope.go:117] "RemoveContainer" containerID="ecf333f033fb5f8af44f74367011135c5c68151c236ed2fb6c9deb690a21c615" Dec 03 20:05:20.709434 master-0 kubenswrapper[9368]: I1203 20:05:20.709395 9368 scope.go:117] "RemoveContainer" containerID="23c2b742ed78624af8a87bafdac0a226661dbc177a2ddfac515be738b044bdfc" Dec 03 20:05:20.710011 master-0 kubenswrapper[9368]: I1203 20:05:20.709580 9368 scope.go:117] "RemoveContainer" containerID="59561622c420df151d8043e444eaec7dca0c22e244b1a6ac8880f20fe809e5c4" Dec 03 20:05:20.710011 master-0 kubenswrapper[9368]: I1203 20:05:20.709893 9368 scope.go:117] "RemoveContainer" containerID="fbb527c9a5f9ae83b24668268584afb30442540a16ac4e78c92bdf23a3df3b8c" Dec 03 20:05:20.710256 master-0 kubenswrapper[9368]: I1203 20:05:20.710107 9368 scope.go:117] "RemoveContainer" containerID="1bde03f53f1aba9b728bfefdb85dd63f6a5517b4c8a0343559f64c1f03ce4e3c" Dec 03 20:05:20.711677 master-0 kubenswrapper[9368]: I1203 20:05:20.710759 9368 scope.go:117] "RemoveContainer" containerID="89033761971c21121ad0eb89f27a17b463a2b2ad814a0f77f8444c0013b9927d" Dec 03 20:05:20.711677 master-0 kubenswrapper[9368]: I1203 20:05:20.711242 9368 scope.go:117] "RemoveContainer" containerID="b30a30d243315200a6f03be3c0553cf1e0283ee13ed3b826cd4d8aa9d7481e81" Dec 03 20:05:20.712354 master-0 kubenswrapper[9368]: I1203 20:05:20.711863 9368 scope.go:117] "RemoveContainer" containerID="d4087ecceb78b95c5961d00b583ffbdd19fde6d2e05194469b5beb565e8c4e58" Dec 03 20:05:20.712354 master-0 kubenswrapper[9368]: I1203 20:05:20.712180 9368 scope.go:117] "RemoveContainer" containerID="9ee7a9ba017971cc72c48a14fbe564128a44ff608d460db457bf85730f38fd52" Dec 03 20:05:20.714531 master-0 kubenswrapper[9368]: I1203 20:05:20.713562 9368 scope.go:117] "RemoveContainer" 
containerID="6b7ea8626bddf0947a6929d715c64bbadf4eccc528c9e9ac527e662555f2ab85" Dec 03 20:05:20.714531 master-0 kubenswrapper[9368]: I1203 20:05:20.713931 9368 scope.go:117] "RemoveContainer" containerID="2f3d798fc128d08f2b78c16a96552eb1af844c024c5ff08c6a9c3b2ad0da6b71" Dec 03 20:05:20.714531 master-0 kubenswrapper[9368]: I1203 20:05:20.714222 9368 scope.go:117] "RemoveContainer" containerID="f3b5610345e0a05c927b635b9b59c02c0bd317dc652790faf73852f8095009c9" Dec 03 20:05:20.715170 master-0 kubenswrapper[9368]: I1203 20:05:20.714978 9368 scope.go:117] "RemoveContainer" containerID="6d3d33e94c6f769c3d4f30283e26a8ebfb068648191bff388aba17779108057c" Dec 03 20:05:20.718263 master-0 kubenswrapper[9368]: I1203 20:05:20.715647 9368 scope.go:117] "RemoveContainer" containerID="193ee1ad3e7ee183f1ea38494d7735760027689afd79629a8d160747a2494f67" Dec 03 20:05:20.718263 master-0 kubenswrapper[9368]: I1203 20:05:20.717047 9368 scope.go:117] "RemoveContainer" containerID="b15d5b3401a95a50f5c18b6410300731cd922d460a927b29c822856e4c00523b" Dec 03 20:05:20.718263 master-0 kubenswrapper[9368]: I1203 20:05:20.717187 9368 scope.go:117] "RemoveContainer" containerID="33fc3458349b78bc19c8b30395e299c49cdfbf37f7e541929fe27fba4fc59440" Dec 03 20:05:20.719835 master-0 kubenswrapper[9368]: I1203 20:05:20.718588 9368 scope.go:117] "RemoveContainer" containerID="d191b57d0995c3a104c3336c01e2a5bd2bc868dba6a6fcca53d04e312b18c0c9" Dec 03 20:05:20.719835 master-0 kubenswrapper[9368]: I1203 20:05:20.719110 9368 scope.go:117] "RemoveContainer" containerID="9936bd164d7a83dfd6c86c4312838d63181895add63b7d1de35a090b8b7d369b" Dec 03 20:05:20.719835 master-0 kubenswrapper[9368]: I1203 20:05:20.719308 9368 scope.go:117] "RemoveContainer" containerID="befef1f27ec31a7dce800c4fe3b217c928cd2c29d212afb9d75ef9e969b32b96" Dec 03 20:05:20.719835 master-0 kubenswrapper[9368]: I1203 20:05:20.719587 9368 scope.go:117] "RemoveContainer" containerID="d44dad492e3736c612049c8b048068de134aee1a61264b8715dac1a1505eb90d" Dec 03 
20:05:20.720591 master-0 kubenswrapper[9368]: I1203 20:05:20.720290 9368 scope.go:117] "RemoveContainer" containerID="f74560024271b473d288e14ac60c9ecd05f2a6752be21eac89b4a74e35f9a5d8" Dec 03 20:05:20.723280 master-0 kubenswrapper[9368]: I1203 20:05:20.723222 9368 scope.go:117] "RemoveContainer" containerID="fd5126a03583a9e60c4f08ab94ff3e4d6dff99b77efc94559f88151386831a39" Dec 03 20:05:20.723543 master-0 kubenswrapper[9368]: I1203 20:05:20.723487 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:05:20.723836 master-0 kubenswrapper[9368]: I1203 20:05:20.723768 9368 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 03 20:05:20.724012 master-0 kubenswrapper[9368]: I1203 20:05:20.723955 9368 scope.go:117] "RemoveContainer" containerID="2dd513c4c7700ec665cd85658968cfa47ab585f4855779f0285e2f319e1b23ec" Dec 03 20:05:21.768728 master-0 kubenswrapper[9368]: I1203 20:05:21.768668 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-j2wgx_5b3ee9a2-0f17-4a04-9191-b60684ef6c29/kube-scheduler-operator-container/2.log" Dec 03 20:05:21.769388 master-0 kubenswrapper[9368]: I1203 20:05:21.769099 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-j2wgx_5b3ee9a2-0f17-4a04-9191-b60684ef6c29/kube-scheduler-operator-container/1.log" Dec 03 20:05:21.776561 master-0 kubenswrapper[9368]: I1203 20:05:21.776539 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/2.log" Dec 03 20:05:21.777035 master-0 kubenswrapper[9368]: I1203 20:05:21.777011 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/1.log" Dec 03 20:05:21.777477 master-0 kubenswrapper[9368]: I1203 20:05:21.777446 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/0.log" Dec 03 20:05:21.779863 master-0 kubenswrapper[9368]: I1203 20:05:21.779839 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q47xb_433c3273-c99e-4d68-befc-06f92d2fc8d5/cluster-baremetal-operator/2.log" Dec 03 20:05:21.780247 master-0 kubenswrapper[9368]: I1203 20:05:21.780224 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q47xb_433c3273-c99e-4d68-befc-06f92d2fc8d5/cluster-baremetal-operator/1.log" Dec 03 20:05:21.780982 master-0 kubenswrapper[9368]: I1203 20:05:21.780957 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q47xb_433c3273-c99e-4d68-befc-06f92d2fc8d5/cluster-baremetal-operator/0.log" Dec 03 20:05:21.783090 master-0 kubenswrapper[9368]: I1203 20:05:21.783063 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-bbd9b9dff-vqzdb_7ed25861-1328-45e7-922e-37588a0b019c/cluster-node-tuning-operator/0.log" Dec 03 20:05:21.786156 master-0 kubenswrapper[9368]: I1203 20:05:21.786136 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/3.log" Dec 03 20:05:21.786591 master-0 kubenswrapper[9368]: I1203 20:05:21.786565 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/2.log" Dec 03 20:05:21.786998 master-0 kubenswrapper[9368]: I1203 20:05:21.786979 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/1.log" Dec 03 20:05:21.788893 master-0 kubenswrapper[9368]: I1203 20:05:21.788863 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/3.log" Dec 03 20:05:21.789281 master-0 kubenswrapper[9368]: I1203 20:05:21.789262 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/2.log" Dec 03 20:05:21.789726 master-0 kubenswrapper[9368]: I1203 20:05:21.789678 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/1.log" Dec 03 20:05:21.793986 master-0 kubenswrapper[9368]: I1203 20:05:21.793962 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/2.log" Dec 03 20:05:21.794402 master-0 kubenswrapper[9368]: I1203 20:05:21.794382 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/1.log" Dec 03 20:05:21.795359 master-0 kubenswrapper[9368]: I1203 20:05:21.795331 9368 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/0.log" Dec 03 20:05:21.807288 master-0 kubenswrapper[9368]: I1203 20:05:21.807230 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/3.log" Dec 03 20:05:21.807791 master-0 kubenswrapper[9368]: I1203 20:05:21.807748 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/2.log" Dec 03 20:05:21.808195 master-0 kubenswrapper[9368]: I1203 20:05:21.808175 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/1.log" Dec 03 20:05:21.814011 master-0 kubenswrapper[9368]: I1203 20:05:21.809714 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/3.log" Dec 03 20:05:21.814011 master-0 kubenswrapper[9368]: I1203 20:05:21.810039 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/2.log" Dec 03 20:05:21.814011 master-0 kubenswrapper[9368]: I1203 20:05:21.810350 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/1.log" Dec 03 20:05:21.814011 master-0 kubenswrapper[9368]: I1203 20:05:21.811653 9368 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/3.log" Dec 03 20:05:21.814011 master-0 kubenswrapper[9368]: I1203 20:05:21.811966 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/2.log" Dec 03 20:05:21.814011 master-0 kubenswrapper[9368]: I1203 20:05:21.812313 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/1.log" Dec 03 20:05:21.814011 master-0 kubenswrapper[9368]: I1203 20:05:21.813560 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/3.log" Dec 03 20:05:21.814011 master-0 kubenswrapper[9368]: I1203 20:05:21.813877 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/2.log" Dec 03 20:05:21.814364 master-0 kubenswrapper[9368]: I1203 20:05:21.814288 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/1.log" Dec 03 20:05:21.815980 master-0 kubenswrapper[9368]: I1203 20:05:21.815952 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-75b4d49d4c-pqz7q_0d4e4f88-7106-4a46-8b63-053345922fb0/package-server-manager/0.log" Dec 03 20:05:21.817896 master-0 kubenswrapper[9368]: I1203 20:05:21.817877 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/3.log" Dec 03 20:05:21.818256 master-0 kubenswrapper[9368]: I1203 20:05:21.818241 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/2.log" Dec 03 20:05:21.818597 master-0 kubenswrapper[9368]: I1203 20:05:21.818575 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/1.log" Dec 03 20:05:21.820867 master-0 kubenswrapper[9368]: I1203 20:05:21.820818 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/2.log" Dec 03 20:05:21.821460 master-0 kubenswrapper[9368]: I1203 20:05:21.821441 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/1.log" Dec 03 20:05:21.821840 master-0 kubenswrapper[9368]: I1203 20:05:21.821825 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/0.log" Dec 03 20:05:21.824932 master-0 kubenswrapper[9368]: I1203 20:05:21.824912 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/3.log" Dec 03 20:05:21.825290 master-0 kubenswrapper[9368]: I1203 20:05:21.825272 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/2.log" Dec 03 20:05:21.825667 master-0 kubenswrapper[9368]: I1203 20:05:21.825647 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/1.log" Dec 03 20:05:22.851081 master-0 kubenswrapper[9368]: E1203 20:05:22.850987 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": context deadline exceeded" Dec 03 20:05:22.851081 master-0 kubenswrapper[9368]: E1203 20:05:22.851025 9368 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 20:05:24.919326 master-0 kubenswrapper[9368]: I1203 20:05:24.919158 9368 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 20:05:29.250187 master-0 kubenswrapper[9368]: I1203 20:05:29.250081 9368 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" Dec 03 20:05:33.725059 master-0 kubenswrapper[9368]: E1203 20:05:33.724950 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Dec 03 20:05:34.919315 master-0 kubenswrapper[9368]: I1203 20:05:34.919211 9368 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 20:05:37.698910 master-0 kubenswrapper[9368]: E1203 20:05:37.698575 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:05:39.250113 master-0 kubenswrapper[9368]: I1203 20:05:39.249927 9368 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 20:05:40.713920 master-0 kubenswrapper[9368]: E1203 20:05:40.713670 9368 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{machine-config-daemon-7t8bs.187dccd4c35af8c8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:machine-config-daemon-7t8bs,UID:9891cf64-59e8-4d8d-94fe-17cfa4b18c1b,APIVersion:v1,ResourceVersion:8463,FieldPath:spec.containers{machine-config-daemon},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a31af646ce5587c051459a88df413dc30be81e7f15399aa909e19effa7de772a\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:57:23.181754568 +0000 UTC m=+108.843004479,LastTimestamp:2025-12-03 19:57:23.181754568 +0000 UTC m=+108.843004479,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 20:05:43.000020 master-0 kubenswrapper[9368]: E1203 20:05:42.999647 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:05:32Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:05:32Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:05:32Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:05:32Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a17e9d83aeb6de5f0851aaacd1a9ebddbc8a4ac3ece2e4af8670aa0c33b8fc9c\\\"],\\\"sizeBytes\\\":1631769045},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2efbb545a141552851226bea008b13d92cbb084339bcfd6923b38d23c382145e\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:e5c3ad640f9c0c84490a0e0da7a1850b7873867936a5b604c07a8075c3a710d0\\\",\\\"re
gistry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1610175307},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98ce2d349f8bc693d76d9a68097b758b987cf17ea3beb66bbd09d12fa78b4d0c\\\"],\\\"sizeBytes\\\":1232076476},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:446e5d504e70c7963ef7b0f090f3fcb19847ef48150299bf030847565d7a579b\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a01ee07f4838bab6cfa5a3d25d867557aa271725bfcd20a1e52d3cc63423c06b\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1204969293},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:31b0e25262b7daa1c7a43042f865ca936aa1a52776994642f88b9a12408d27da\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ae694324b195581f542841a64634b63bae3d63332705b3a27320d18fde2aebe8\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201363276},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ebe9f523f5c211a3a0f2570331dddcd5be15b12c1fecd9b8b121f881bfaad029\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a31af646ce5587c051459a88df413dc30be81e7f15399aa909e19effa7de772a\\\"],\\\"sizeBytes\\\":983731853},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\\\"],\\\"sizeBytes\\\":938321573},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3c0962dbbad51633a7d97ef253d0249269bfe3bbef3bfe99a99457470e7a682\\\"],\\\"sizeBytes\\\":912736453},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dfc0403f71f7c926db1084c7
fb5fb4f19007271213ee34f6f3d3eecdbe817d6b\\\"],\\\"sizeBytes\\\":874839630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e8f313372fe49afad871cc56225dcd4d31bed249abeab55fb288e1f854138fbf\\\"],\\\"sizeBytes\\\":870581225},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc72da7f7930eb09abf6f8dbe577bb537e3a2a59dc0e14a4319b42c0212218d1\\\"],\\\"sizeBytes\\\":857083855},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f8a38d71a75c4fa803249cc709d60039d14878e218afd88a86083526ee8f78ad\\\"],\\\"sizeBytes\\\":856674149},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e5c0acdd03dc840d7345ae397feaf6147a32a8fef89a0ac2ddc8d14b068c9ff\\\"],\\\"sizeBytes\\\":767313881},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:184239929f74bb7c56c1cf5b94b5f91dd4013a87034fe04b9fa1027d2bb6c5a4\\\"],\\\"sizeBytes\\\":682385666},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0d866f93bed16cfebd8019ad6b89a4dd4abedfc20ee5d28d7edad045e7df0fda\\\"],\\\"sizeBytes\\\":677540255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b99ce0f31213291444482af4af36345dc93acdbe965868073e8232797b8a2f14\\\"],\\\"sizeBytes\\\":672854011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8\\\"],\\\"sizeBytes\\\":616123373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:da6f62afd2795d1b0af69532a5534c099bbb81d4e7abd2616b374db191552c51\\\"],\\\"sizeBytes\\\":583850203},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:51a4c20765f54b6a6b5513f97cf54bb99631c2abe860949293456886a74f87fe\\\"],\\\"sizeBytes\\\":576621883},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc9758be9f0f0a480fb5e119ecb1e1101ef807bdc765a155212a8188d79b9e60\\\"],\\\"sizeBytes\\\":552687886},{\\\"names\\\":[\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32236659da74056138c839429f304a96ba36dd304d7eefb6b2618ecfdf6308e3\\\"],\\\"sizeBytes\\\":551903461},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e8903affdf29401b9a86b9f58795c9f445f34194960c7b2734f30601c48cbdf\\\"],\\\"sizeBytes\\\":543241813},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c921698d30c8175da0c124f72748e93551d6903b0f34d26743b60cb12d25cb1\\\"],\\\"sizeBytes\\\":532668041},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ceaa4102b35e54be54e23c8ea73bb0dac4978cffb54105ad00b51393f47595da\\\"],\\\"sizeBytes\\\":532338751},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ca4933b9ba55069205ea53970128c4e8c4b46560ef721c8aaee00aaf736664b5\\\"],\\\"sizeBytes\\\":512852463},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:98c80d92a2ef8d44ee625b229b77b7bfdb1b06cbfe0d4df9e2ca2cba904467f7\\\"],\\\"sizeBytes\\\":512468025},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\\\"],\\\"sizeBytes\\\":509451797},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae8c6193ace2c439dd93d8129f68f3704727650851a628c906bff9290940ef03\\\"],\\\"sizeBytes\\\":508056015},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a2ef63f356c11ba629d8038474ab287797340de1219b4fee97c386975689110\\\"],\\\"sizeBytes\\\":507701628},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:84a52132860e74998981b76c08d38543561197c3da77836c670fa8e394c5ec17\\\"],\\\"sizeBytes\\\":506755373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:492103a8365ef9a1d5f237b4ba90aff87369167ec91db29ff0251ba5aab2b419\\\"],\\\"sizeBytes\\\":505663073},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2b518cb834a0b6ca50d73eceb5f8e64aefb09094d39e4ba0d
8e4632f6cdf908\\\"],\\\"sizeBytes\\\":505642108},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:58ed827ee19ac91b6f860d307797b24b8aec02e671605388c4afe4fa19ddfc36\\\"],\\\"sizeBytes\\\":503354646},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eefdc67602b8bc3941001b030ab95d82e10432f814634b80eb8ce45bc9ebd3de\\\"],\\\"sizeBytes\\\":503025552},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3051af3343018fecbf3a6edacea69de841fc5211c09e7fb6a2499188dc979395\\\"],\\\"sizeBytes\\\":502450335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4cb6ecfb89e53653b69ae494ebc940b9fcf7b7db317b156e186435cc541589d9\\\"],\\\"sizeBytes\\\":500957387},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3d835ce07d1bec4a4b13f0bca5ea20ea5c781ea7853d7b42310f4ad8aeba6d7c\\\"],\\\"sizeBytes\\\":500863090},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49a6a3308d885301c7718a465f1af2d08a617abbdff23352d5422d1ae4af33cf\\\"],\\\"sizeBytes\\\":499812475},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e254a7fb8a2643817718cfdb54bc819e86eb84232f6e2456548c55c5efb09d2\\\"],\\\"sizeBytes\\\":499719811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:44e82a51fce7b5996b183c10c44bd79b0e1ae2257fd5809345fbca1c50aaa08f\\\"],\\\"sizeBytes\\\":499138950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:93145fd0c004dc4fca21435a32c7e55e962f321aff260d702f387cfdebee92a5\\\"],\\\"sizeBytes\\\":499096673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0c6de747539dd00ede882fb4f73cead462bf0a7efda7173fd5d443ef7a00251\\\"],\\\"sizeBytes\\\":490470354},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6199be91b821875ba2609cf7fa886b74b9a8b573622fe33cc1bc39cd55acac08\\\"],\\\"sizeBytes\\\":489542560},{\\\"names\\\":[\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:ebd79294a663cb38370ae81f9cda91cef7fb1370ec5b495b4bdb95e77272e6a8\\\"],\\\"sizeBytes\\\":481573011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b4e0b20fdb38d516e871ff5d593c4273cc9933cb6a65ec93e727ca4a7777fd20\\\"],\\\"sizeBytes\\\":478931717},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a3e2790bda8898df5e4e9cf1878103ac483ea1633819d76ea68976b0b2062b6\\\"],\\\"sizeBytes\\\":478655954},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b294511902fd7a80e135b23895a944570932dc0fab1ee22f296523840740332e\\\"],\\\"sizeBytes\\\":465302163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23aa409d98c18a25b5dd3c14b4c5a88eba2c793d020f2deb3bafd58a2225c328\\\"],\\\"sizeBytes\\\":465158513},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:656fe650bac2929182cd0cf7d7e566d089f69e06541b8329c6d40b89346c03ca\\\"],\\\"sizeBytes\\\":462741734}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:05:44.919663 master-0 kubenswrapper[9368]: I1203 20:05:44.919574 9368 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 20:05:49.250908 master-0 kubenswrapper[9368]: I1203 20:05:49.250221 9368 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: 
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 20:05:52.230236 master-0 kubenswrapper[9368]: I1203 20:05:52.230114 9368 generic.go:334] "Generic (PLEG): container finished" podID="7bce50c457ac1f4721bc81a570dd238a" containerID="2bc34dd3df75f29672c73e791045d1e82bca7040b7e6a8728aa43a5fe5c90f24" exitCode=255 Dec 03 20:05:52.232832 master-0 kubenswrapper[9368]: I1203 20:05:52.232760 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/3.log" Dec 03 20:05:52.233621 master-0 kubenswrapper[9368]: I1203 20:05:52.233569 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/2.log" Dec 03 20:05:52.234114 master-0 kubenswrapper[9368]: I1203 20:05:52.234062 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/1.log" Dec 03 20:05:52.234841 master-0 kubenswrapper[9368]: I1203 20:05:52.234747 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/0.log" Dec 03 20:05:52.234937 master-0 kubenswrapper[9368]: I1203 20:05:52.234885 9368 generic.go:334] "Generic (PLEG): container finished" podID="367c2c7c-1fc8-4608-aa94-b64c6c70cc61" containerID="8112a7cb98ed4f9746283158ddbbb35ec5fbfefafdb864fd1afaa4c7f81f5842" exitCode=1 Dec 03 20:05:53.000534 master-0 kubenswrapper[9368]: E1203 20:05:53.000472 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:05:54.700168 master-0 kubenswrapper[9368]: E1203 20:05:54.700065 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:05:54.727446 master-0 kubenswrapper[9368]: E1203 20:05:54.727356 9368 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 20:05:54.727446 master-0 kubenswrapper[9368]: I1203 20:05:54.727402 9368 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Dec 03 20:06:03.001435 master-0 kubenswrapper[9368]: E1203 20:06:03.001382 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:06:10.944413 master-0 kubenswrapper[9368]: I1203 20:06:10.944326 9368 status_manager.go:851] "Failed to get status for pod" podUID="b2021db5-b27a-4e06-beec-d9ba82aa1ffc" pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods cluster-autoscaler-operator-7f88444875-kqfs4)" Dec 03 20:06:11.701948 master-0 kubenswrapper[9368]: E1203 20:06:11.701600 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 20:06:13.003373 master-0 kubenswrapper[9368]: E1203 20:06:13.003303 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": context deadline exceeded" Dec 03 20:06:14.717909 master-0 kubenswrapper[9368]: E1203 20:06:14.717640 9368 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{machine-api-operator-7486ff55f-9p9rq.187dccd4c64abe1b openshift-machine-api 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-api,Name:machine-api-operator-7486ff55f-9p9rq,UID:ad22d8ed-2476-441b-aa3b-a7845606b0ac,APIVersion:v1,ResourceVersion:8166,FieldPath:spec.containers{machine-api-operator},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f8a38d71a75c4fa803249cc709d60039d14878e218afd88a86083526ee8f78ad\" in 44.002s (44.002s including waiting). 
Image size: 856674149 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:57:23.231022619 +0000 UTC m=+108.892272540,LastTimestamp:2025-12-03 19:57:23.231022619 +0000 UTC m=+108.892272540,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 20:06:15.768825 master-0 kubenswrapper[9368]: E1203 20:06:15.767292 9368 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="55.06s" Dec 03 20:06:15.768825 master-0 kubenswrapper[9368]: I1203 20:06:15.767343 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerDied","Data":"7def324ef495c1e55c8e9233ccd93d3408c35454ff9a9bc3bac5d21a48173630"} Dec 03 20:06:15.769925 master-0 kubenswrapper[9368]: I1203 20:06:15.769540 9368 scope.go:117] "RemoveContainer" containerID="90564517af04049d6ec0e898c2ae0505288ea36bcc26e8b87f6cfddbd789cf9b" Dec 03 20:06:15.773815 master-0 kubenswrapper[9368]: I1203 20:06:15.770524 9368 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81"} pod="kube-system/bootstrap-kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 03 20:06:15.773815 master-0 kubenswrapper[9368]: I1203 20:06:15.770559 9368 scope.go:117] "RemoveContainer" containerID="2bc34dd3df75f29672c73e791045d1e82bca7040b7e6a8728aa43a5fe5c90f24" Dec 03 20:06:15.773815 master-0 kubenswrapper[9368]: I1203 20:06:15.770605 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" 
containerName="kube-controller-manager" containerID="cri-o://202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81" gracePeriod=30 Dec 03 20:06:15.795865 master-0 kubenswrapper[9368]: I1203 20:06:15.792741 9368 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 03 20:06:15.821888 master-0 kubenswrapper[9368]: I1203 20:06:15.821199 9368 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831389 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6zrxk" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831440 9368 status_manager.go:317] "Container readiness changed for unknown container" pod="kube-system/bootstrap-kube-controller-manager-master-0" containerID="cri-o://7def324ef495c1e55c8e9233ccd93d3408c35454ff9a9bc3bac5d21a48173630" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831451 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831462 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831475 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sp868" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831490 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" 
event={"ID":"61b16a8a-27a2-4a07-b5f9-10a5be2ec870","Type":"ContainerDied","Data":"2eacfef43cc179b008d0d050644e4aa26e93edb95b342c88f74321432bd7fc00"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831516 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831551 9368 status_manager.go:379] "Container startup changed for unknown container" pod="kube-system/bootstrap-kube-controller-manager-master-0" containerID="cri-o://90564517af04049d6ec0e898c2ae0505288ea36bcc26e8b87f6cfddbd789cf9b" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831560 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831574 9368 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" containerID="cri-o://efb0326864f224addc60569e753ed4f7ba080c2fc63c85d174a9de0f4aa3dad6" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831582 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831596 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831610 9368 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" containerID="cri-o://3595f145ca5f9a4066302e9ae5d79e04995d58d28db2a03322a4e2a341e9fec2" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: 
I1203 20:06:15.831621 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831634 9368 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" containerID="cri-o://026026ef6ee70bf24fbc2d66c86cdbf2ce61498e9a51c23017b8994c7f1700dd" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831642 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831656 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831666 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831693 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831705 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831715 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831725 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831736 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831764 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831799 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831817 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r2c8x" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831831 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"c46583dca69d50bb12bc004d7ee3300f","Type":"ContainerStarted","Data":"73fd77c7f3160f50b85cebcaf7773a33c44b0958115b084cb590bef38d48ba5c"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831848 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mc8kx" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831860 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831884 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6zrxk" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831909 9368 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831920 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831931 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831954 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831968 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"c46583dca69d50bb12bc004d7ee3300f","Type":"ContainerStarted","Data":"a72510073f92e9ff068e8652b1a65285f64ee333e40d80be23e60bf13a3ce72d"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831979 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"c46583dca69d50bb12bc004d7ee3300f","Type":"ContainerStarted","Data":"89262883631fb1dcd59cc7a0a7e0379a0e77dd0b25dc2b21a16372a6fe8d007e"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.831993 9368 status_manager.go:317] "Container readiness changed for unknown container" pod="kube-system/bootstrap-kube-controller-manager-master-0" containerID="cri-o://90564517af04049d6ec0e898c2ae0505288ea36bcc26e8b87f6cfddbd789cf9b" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832002 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:06:15.832773 master-0 
kubenswrapper[9368]: I1203 20:06:15.832014 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"1992b1130615c3114c9b58cd6decbf77558f0295aafbe17982440031c3ee9788"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832028 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832039 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerDied","Data":"121d9626cd0411e9b91e157dd5da2678c7631550b10f391133d8192123b5c231"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832053 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" event={"ID":"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf","Type":"ContainerDied","Data":"00ef38cb5e4574cde1559c4f74b2af2d1020f41ece0ea48de28dfccd34cbb389"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832090 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832116 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832126 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832152 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832164 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" event={"ID":"61b16a8a-27a2-4a07-b5f9-10a5be2ec870","Type":"ContainerDied","Data":"d5e5f345f4c7214304a5c25631f848938166f13ee76c5366965060641404f3cc"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832188 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832202 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99" event={"ID":"61b16a8a-27a2-4a07-b5f9-10a5be2ec870","Type":"ContainerDied","Data":"f99c24374916ccefecbe6788346b4cb9fb3b6dbba7b45f5a9bea3621fcd4bafb"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832213 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-r2kpn" event={"ID":"c4d45235-fb1a-4626-a41e-b1e34f7bf76e","Type":"ContainerDied","Data":"65f13f5f310f6f953b71a1a783c24c03bd5eb6d2106c3ba74515208177e8e054"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832228 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832238 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"9afe01c7-825c-43d1-8425-0317cdde11d6","Type":"ContainerDied","Data":"7defd583f52b28f4c8a42f8533bc6a235b9b9753c15d53b3d581070bd6b239c4"} Dec 03 20:06:15.832773 
master-0 kubenswrapper[9368]: I1203 20:06:15.832254 9368 status_manager.go:317] "Container readiness changed for unknown container" pod="kube-system/bootstrap-kube-controller-manager-master-0" containerID="cri-o://23c2b742ed78624af8a87bafdac0a226661dbc177a2ddfac515be738b044bdfc" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832263 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832274 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"186cc14f-5f58-43ca-8ffa-db07606ff0f7","Type":"ContainerDied","Data":"5217957523f4b5166716d8ff3b268cfc1e054e38ab89fcd916d9adc0a629dce1"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832286 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" event={"ID":"433c3273-c99e-4d68-befc-06f92d2fc8d5","Type":"ContainerDied","Data":"95d0ca3a853fd9f93e01c67870d1d4d269549c7560c451b67830fa1b176c7eb8"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832303 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832313 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832323 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" event={"ID":"af2023e1-9c7a-40af-a6bf-fba31c3565b1","Type":"ContainerDied","Data":"46ead743a71c6c2931e92ae425d4f75d1fb17286150d55d4a739c7296e0b2be0"} Dec 03 20:06:15.832773 master-0 
kubenswrapper[9368]: I1203 20:06:15.832339 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832350 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832363 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" event={"ID":"5b3ee9a2-0f17-4a04-9191-b60684ef6c29","Type":"ContainerDied","Data":"654ce27fb70f480beba5ca8af4a5c2faaea9183cad789692159d1b32739ab7ee"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832377 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" event={"ID":"f749c7f2-1fd7-4078-a92d-0ae5523998ac","Type":"ContainerDied","Data":"744faadce32102a4f51bf311e6ce0b868fa1346e51cfebbdff76ea1eb3693fe2"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832391 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" event={"ID":"01d51d9a-9beb-4357-9dc2-aeac210cd0c4","Type":"ContainerDied","Data":"16f863a99a7b4db6f75ba856ee48509b29d62e76913caec7ed378fa26c23b8d6"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832405 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" event={"ID":"11e2c94f-f9e9-415b-a550-3006a4632ba4","Type":"ContainerDied","Data":"49a13ebc694f26cd89010ddce04800eb4f4c986f75a07318bd04a364d89d8c75"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832419 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" event={"ID":"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f","Type":"ContainerDied","Data":"d547bb93c93c86e1c0269c4fb32a10d62340ebafe98b4ab6c6927fd1a6493839"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832433 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" event={"ID":"daa8efc0-4514-4a14-80f5-ab9eca53a127","Type":"ContainerDied","Data":"12ba33f367264d50b59a4676b1e61bc0a6d45703296fe265553724b4dbafb201"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832450 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" event={"ID":"a185ee17-4b4b-4d20-a8ed-56a2a01f1807","Type":"ContainerDied","Data":"341b601d165e50b66bc27ce1e4916e7fa4ef52e1059015c14333dc841ef12229"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832465 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" event={"ID":"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3","Type":"ContainerDied","Data":"e8e3349c588e032c5b231176a3b6f6e0b37f7441a8fa80e01747aaded424ead5"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832478 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" event={"ID":"6eb4700c-6af0-468b-afc8-1e09b902d6bf","Type":"ContainerDied","Data":"441d867492f0f10ece1761a5339bcd749dc935547bbd2edddb84af4fe04b1249"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832493 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" 
event={"ID":"943feb0d-7d31-446a-9100-dfc4ef013d12","Type":"ContainerDied","Data":"0acf9557821accd587e8bd9912ad989c059f24ef17109b73584eca0d899729a7"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832506 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" event={"ID":"b673cb04-f6f0-4113-bdcd-d6685b942c9f","Type":"ContainerDied","Data":"efb0326864f224addc60569e753ed4f7ba080c2fc63c85d174a9de0f4aa3dad6"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832519 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" event={"ID":"73b7027e-44f5-4c7b-9226-585a90530535","Type":"ContainerDied","Data":"3595f145ca5f9a4066302e9ae5d79e04995d58d28db2a03322a4e2a341e9fec2"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832531 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" event={"ID":"1f82c7a1-ec21-497d-86f2-562cafa7ace7","Type":"ContainerDied","Data":"026026ef6ee70bf24fbc2d66c86cdbf2ce61498e9a51c23017b8994c7f1700dd"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832543 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" event={"ID":"367c2c7c-1fc8-4608-aa94-b64c6c70cc61","Type":"ContainerDied","Data":"0e21d1a78f01b2c86e8a517177c7568f6695fa81ca18975759c979beb59d6b4b"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832560 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" event={"ID":"d210062f-c07e-419f-a551-c37571565686","Type":"ContainerDied","Data":"2d7be3731fbc745283a2d759f396c31ac1367c0ba714305c646e32b354747fdc"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832572 
9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" event={"ID":"5b3ee9a2-0f17-4a04-9191-b60684ef6c29","Type":"ContainerStarted","Data":"fbb527c9a5f9ae83b24668268584afb30442540a16ac4e78c92bdf23a3df3b8c"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832583 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" event={"ID":"f749c7f2-1fd7-4078-a92d-0ae5523998ac","Type":"ContainerStarted","Data":"c773db2a1877d6932c57258d42f5b394294b213e87c30f8a0a0e8aca67ad0063"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832593 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"186cc14f-5f58-43ca-8ffa-db07606ff0f7","Type":"ContainerDied","Data":"b9dc51ef0fc54ef8c73610d3365ca5738cac69e45dfc432bfd97fab8a56b1782"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832605 9368 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9dc51ef0fc54ef8c73610d3365ca5738cac69e45dfc432bfd97fab8a56b1782" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832615 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" event={"ID":"1f82c7a1-ec21-497d-86f2-562cafa7ace7","Type":"ContainerStarted","Data":"86946c50a244c8ca671f26a68869df0617a67bae9b9fa135a946600795d8f546"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832625 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" event={"ID":"a185ee17-4b4b-4d20-a8ed-56a2a01f1807","Type":"ContainerStarted","Data":"4379513206c1639335bb05ee8287de982289e551bc3d4966f636dc8340b2ecf8"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 
20:06:15.832636 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" event={"ID":"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf","Type":"ContainerStarted","Data":"739763509f2115327ad6e763bd6fe98f715c6203046f1bff98acddc694d4b998"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832646 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"90564517af04049d6ec0e898c2ae0505288ea36bcc26e8b87f6cfddbd789cf9b"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832659 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" event={"ID":"6eb4700c-6af0-468b-afc8-1e09b902d6bf","Type":"ContainerStarted","Data":"c45b306077a652b23f0900eb2cbb0416939e7dc4bb4d4fe2ac8622e1b6c0da5a"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832670 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" event={"ID":"433c3273-c99e-4d68-befc-06f92d2fc8d5","Type":"ContainerStarted","Data":"92f773171332fe1047c615595f130c77be75d242dc97e2d6092f29a8d7898322"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832683 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" event={"ID":"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f","Type":"ContainerStarted","Data":"1b19da259a1027d9535ea67f48d90da4b466d1543b7cd71e10242c0f818c0341"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832694 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" 
event={"ID":"daa8efc0-4514-4a14-80f5-ab9eca53a127","Type":"ContainerStarted","Data":"81434316be96a2b14a22680f8e4bee888b65b3e97cc5bc7df607a91a047bac12"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832705 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" event={"ID":"367c2c7c-1fc8-4608-aa94-b64c6c70cc61","Type":"ContainerStarted","Data":"5992edcac541fa77269930de1a02dd784ce5397190135d38e4719fad6d964b45"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832716 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" event={"ID":"af2023e1-9c7a-40af-a6bf-fba31c3565b1","Type":"ContainerStarted","Data":"9ee7a9ba017971cc72c48a14fbe564128a44ff608d460db457bf85730f38fd52"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832728 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" event={"ID":"b673cb04-f6f0-4113-bdcd-d6685b942c9f","Type":"ContainerStarted","Data":"feb425dcaaabe6805f86918a38eb057c059970fea17cb869db0d5b239fb81d26"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832738 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"9afe01c7-825c-43d1-8425-0317cdde11d6","Type":"ContainerDied","Data":"d01898ca09cc6e5ead466458571ed251bc45975a2add401e6cca184da08be158"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832748 9368 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d01898ca09cc6e5ead466458571ed251bc45975a2add401e6cca184da08be158" Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832759 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" event={"ID":"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3","Type":"ContainerStarted","Data":"3bc3b0513399a4dbff7b1cc288d3bbd1c2bcb7799b1e463991c6f6704c28e766"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832770 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-r2kpn" event={"ID":"c4d45235-fb1a-4626-a41e-b1e34f7bf76e","Type":"ContainerStarted","Data":"1b9c89a9ed1d6a5edcf56e65d5ad39a012cfdb4a44a26cc4c5d38d4b93a5c317"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832801 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" event={"ID":"01d51d9a-9beb-4357-9dc2-aeac210cd0c4","Type":"ContainerStarted","Data":"fd46fba44017fc29b42f47a162580459a77d137ccc3daa28e90491a52f6c5e38"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832852 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" event={"ID":"943feb0d-7d31-446a-9100-dfc4ef013d12","Type":"ContainerStarted","Data":"8c7c6a085ea6bbbf2982572791af9a9759a9fc311f8df3506418406ff3e1f36a"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832876 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" event={"ID":"11e2c94f-f9e9-415b-a550-3006a4632ba4","Type":"ContainerStarted","Data":"b85781bda41437456bdf7f25de8ffb27da808b560ac58497b8aae4bf24b2109f"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832889 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" 
event={"ID":"ce4afc7a-a338-4a2c-bada-22d4bac75d49","Type":"ContainerDied","Data":"5c7672f753235f31861db5762e7805d7dbeffaa2c208518211750ae8f4c45f42"} Dec 03 20:06:15.832773 master-0 kubenswrapper[9368]: I1203 20:06:15.832902 9368 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c7672f753235f31861db5762e7805d7dbeffaa2c208518211750ae8f4c45f42" Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.832913 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" event={"ID":"73b7027e-44f5-4c7b-9226-585a90530535","Type":"ContainerStarted","Data":"01d4fad7a92f9b0ef4f9b8cabd561aa16080f1f5660decb4d1b69b14510e1322"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.832930 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerDied","Data":"3c8f577be66a40b37f0664a12c17056548ea3c9d36cd14f671ca30ad04cfd997"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.832943 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" event={"ID":"367c2c7c-1fc8-4608-aa94-b64c6c70cc61","Type":"ContainerDied","Data":"5992edcac541fa77269930de1a02dd784ce5397190135d38e4719fad6d964b45"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.832964 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" event={"ID":"433c3273-c99e-4d68-befc-06f92d2fc8d5","Type":"ContainerDied","Data":"92f773171332fe1047c615595f130c77be75d242dc97e2d6092f29a8d7898322"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.832982 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" 
event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerDied","Data":"90564517af04049d6ec0e898c2ae0505288ea36bcc26e8b87f6cfddbd789cf9b"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833002 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" event={"ID":"11e2c94f-f9e9-415b-a550-3006a4632ba4","Type":"ContainerDied","Data":"b85781bda41437456bdf7f25de8ffb27da808b560ac58497b8aae4bf24b2109f"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833018 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" event={"ID":"daa8efc0-4514-4a14-80f5-ab9eca53a127","Type":"ContainerDied","Data":"81434316be96a2b14a22680f8e4bee888b65b3e97cc5bc7df607a91a047bac12"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833034 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" event={"ID":"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f","Type":"ContainerDied","Data":"1b19da259a1027d9535ea67f48d90da4b466d1543b7cd71e10242c0f818c0341"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833047 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" event={"ID":"a185ee17-4b4b-4d20-a8ed-56a2a01f1807","Type":"ContainerDied","Data":"4379513206c1639335bb05ee8287de982289e551bc3d4966f636dc8340b2ecf8"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833061 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" 
event={"ID":"5b3ee9a2-0f17-4a04-9191-b60684ef6c29","Type":"ContainerDied","Data":"fbb527c9a5f9ae83b24668268584afb30442540a16ac4e78c92bdf23a3df3b8c"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833077 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" event={"ID":"f749c7f2-1fd7-4078-a92d-0ae5523998ac","Type":"ContainerDied","Data":"c773db2a1877d6932c57258d42f5b394294b213e87c30f8a0a0e8aca67ad0063"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833089 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" event={"ID":"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3","Type":"ContainerDied","Data":"3bc3b0513399a4dbff7b1cc288d3bbd1c2bcb7799b1e463991c6f6704c28e766"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833102 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" event={"ID":"01d51d9a-9beb-4357-9dc2-aeac210cd0c4","Type":"ContainerDied","Data":"fd46fba44017fc29b42f47a162580459a77d137ccc3daa28e90491a52f6c5e38"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833114 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" event={"ID":"6eb4700c-6af0-468b-afc8-1e09b902d6bf","Type":"ContainerDied","Data":"c45b306077a652b23f0900eb2cbb0416939e7dc4bb4d4fe2ac8622e1b6c0da5a"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833125 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c45b306077a652b23f0900eb2cbb0416939e7dc4bb4d4fe2ac8622e1b6c0da5a"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833139 9368 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"441d867492f0f10ece1761a5339bcd749dc935547bbd2edddb84af4fe04b1249"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833149 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" event={"ID":"943feb0d-7d31-446a-9100-dfc4ef013d12","Type":"ContainerDied","Data":"8c7c6a085ea6bbbf2982572791af9a9759a9fc311f8df3506418406ff3e1f36a"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833159 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8c7c6a085ea6bbbf2982572791af9a9759a9fc311f8df3506418406ff3e1f36a"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833167 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0acf9557821accd587e8bd9912ad989c059f24ef17109b73584eca0d899729a7"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833178 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" event={"ID":"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf","Type":"ContainerDied","Data":"739763509f2115327ad6e763bd6fe98f715c6203046f1bff98acddc694d4b998"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833189 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"739763509f2115327ad6e763bd6fe98f715c6203046f1bff98acddc694d4b998"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833197 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00ef38cb5e4574cde1559c4f74b2af2d1020f41ece0ea48de28dfccd34cbb389"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833207 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" event={"ID":"af2023e1-9c7a-40af-a6bf-fba31c3565b1","Type":"ContainerDied","Data":"9ee7a9ba017971cc72c48a14fbe564128a44ff608d460db457bf85730f38fd52"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833217 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46ead743a71c6c2931e92ae425d4f75d1fb17286150d55d4a739c7296e0b2be0"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833228 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" event={"ID":"d210062f-c07e-419f-a551-c37571565686","Type":"ContainerStarted","Data":"c6b3634120684995d22a693de67538178c3ed07c931e5f8fb60849b8111c3f07"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833246 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" event={"ID":"a185ee17-4b4b-4d20-a8ed-56a2a01f1807","Type":"ContainerStarted","Data":"6b7ea8626bddf0947a6929d715c64bbadf4eccc528c9e9ac527e662555f2ab85"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833258 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" event={"ID":"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b","Type":"ContainerDied","Data":"550fa2508090ec9228e5344d14eb3903d47f1fd24e235f6122c95a9e089d9e56"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833272 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" event={"ID":"433c3273-c99e-4d68-befc-06f92d2fc8d5","Type":"ContainerStarted","Data":"0714d8c339d81fe37d65f8b61284fb17442521338c0d1beb9a6cde0e4b83dcaa"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833286 9368 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"ecf333f033fb5f8af44f74367011135c5c68151c236ed2fb6c9deb690a21c615"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833299 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" event={"ID":"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3","Type":"ContainerStarted","Data":"fd5126a03583a9e60c4f08ab94ff3e4d6dff99b77efc94559f88151386831a39"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833310 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" event={"ID":"367c2c7c-1fc8-4608-aa94-b64c6c70cc61","Type":"ContainerStarted","Data":"1bde03f53f1aba9b728bfefdb85dd63f6a5517b4c8a0343559f64c1f03ce4e3c"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833322 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" event={"ID":"11e2c94f-f9e9-415b-a550-3006a4632ba4","Type":"ContainerStarted","Data":"d44dad492e3736c612049c8b048068de134aee1a61264b8715dac1a1505eb90d"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833337 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" event={"ID":"943feb0d-7d31-446a-9100-dfc4ef013d12","Type":"ContainerStarted","Data":"d191b57d0995c3a104c3336c01e2a5bd2bc868dba6a6fcca53d04e312b18c0c9"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833356 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" 
event={"ID":"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf","Type":"ContainerStarted","Data":"befef1f27ec31a7dce800c4fe3b217c928cd2c29d212afb9d75ef9e969b32b96"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833368 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" event={"ID":"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b","Type":"ContainerStarted","Data":"cf0ee7669690c522329aaa6f304ff26947b64db398890f4e63ca209b2410a161"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833379 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" event={"ID":"f749c7f2-1fd7-4078-a92d-0ae5523998ac","Type":"ContainerStarted","Data":"b30a30d243315200a6f03be3c0553cf1e0283ee13ed3b826cd4d8aa9d7481e81"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833391 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" event={"ID":"01d51d9a-9beb-4357-9dc2-aeac210cd0c4","Type":"ContainerStarted","Data":"89033761971c21121ad0eb89f27a17b463a2b2ad814a0f77f8444c0013b9927d"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833401 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" event={"ID":"6eb4700c-6af0-468b-afc8-1e09b902d6bf","Type":"ContainerStarted","Data":"f3b5610345e0a05c927b635b9b59c02c0bd317dc652790faf73852f8095009c9"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833411 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" event={"ID":"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f","Type":"ContainerStarted","Data":"d4087ecceb78b95c5961d00b583ffbdd19fde6d2e05194469b5beb565e8c4e58"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 
20:06:15.833422 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" event={"ID":"daa8efc0-4514-4a14-80f5-ab9eca53a127","Type":"ContainerStarted","Data":"6d3d33e94c6f769c3d4f30283e26a8ebfb068648191bff388aba17779108057c"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833434 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw" event={"ID":"b84835e3-e8bc-4aa4-a8f3-f9be702a358a","Type":"ContainerDied","Data":"193ee1ad3e7ee183f1ea38494d7735760027689afd79629a8d160747a2494f67"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833453 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerDied","Data":"7ffe9984ab39638ad7730b79c49181e26ef0a2e2748c84910693d2353db0a811"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833465 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" event={"ID":"63e3d36d-1676-4f90-ac9a-d85b861a4655","Type":"ContainerDied","Data":"59561622c420df151d8043e444eaec7dca0c22e244b1a6ac8880f20fe809e5c4"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833478 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx" event={"ID":"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8","Type":"ContainerDied","Data":"33fc3458349b78bc19c8b30395e299c49cdfbf37f7e541929fe27fba4fc59440"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833491 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" 
event={"ID":"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9","Type":"ContainerDied","Data":"2dd513c4c7700ec665cd85658968cfa47ab585f4855779f0285e2f319e1b23ec"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833503 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" event={"ID":"f9f99422-7991-40ef-92a1-de2e603e47b9","Type":"ContainerDied","Data":"9936bd164d7a83dfd6c86c4312838d63181895add63b7d1de35a090b8b7d369b"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833515 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" event={"ID":"7ed25861-1328-45e7-922e-37588a0b019c","Type":"ContainerDied","Data":"b15d5b3401a95a50f5c18b6410300731cd922d460a927b29c822856e4c00523b"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833527 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" event={"ID":"367c2c7c-1fc8-4608-aa94-b64c6c70cc61","Type":"ContainerDied","Data":"1bde03f53f1aba9b728bfefdb85dd63f6a5517b4c8a0343559f64c1f03ce4e3c"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833537 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1bde03f53f1aba9b728bfefdb85dd63f6a5517b4c8a0343559f64c1f03ce4e3c"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833545 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5992edcac541fa77269930de1a02dd784ce5397190135d38e4719fad6d964b45"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833551 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"0e21d1a78f01b2c86e8a517177c7568f6695fa81ca18975759c979beb59d6b4b"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833561 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" event={"ID":"5decce88-c71e-411c-87b5-a37dd0f77e7b","Type":"ContainerDied","Data":"ce3971a00b14ee7d8820c7e2ce38f070172641049e39dce3eb3a076d83a464ea"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833573 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerDied","Data":"23c2b742ed78624af8a87bafdac0a226661dbc177a2ddfac515be738b044bdfc"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833586 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"23c2b742ed78624af8a87bafdac0a226661dbc177a2ddfac515be738b044bdfc"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833598 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" event={"ID":"78a864f2-934f-4197-9753-24c9bc7f1fca","Type":"ContainerDied","Data":"6f8d03455884710e737b779ab993de7b077a6712d61dd531eb926a20dcac48c1"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833609 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" event={"ID":"0d4e4f88-7106-4a46-8b63-053345922fb0","Type":"ContainerDied","Data":"2f3d798fc128d08f2b78c16a96552eb1af844c024c5ff08c6a9c3b2ad0da6b71"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833620 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" 
event={"ID":"433c3273-c99e-4d68-befc-06f92d2fc8d5","Type":"ContainerDied","Data":"0714d8c339d81fe37d65f8b61284fb17442521338c0d1beb9a6cde0e4b83dcaa"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833630 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"92f773171332fe1047c615595f130c77be75d242dc97e2d6092f29a8d7898322"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833637 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95d0ca3a853fd9f93e01c67870d1d4d269549c7560c451b67830fa1b176c7eb8"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833645 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerDied","Data":"ecf333f033fb5f8af44f74367011135c5c68151c236ed2fb6c9deb690a21c615"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833654 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"90564517af04049d6ec0e898c2ae0505288ea36bcc26e8b87f6cfddbd789cf9b"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833661 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7def324ef495c1e55c8e9233ccd93d3408c35454ff9a9bc3bac5d21a48173630"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833669 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" event={"ID":"11e2c94f-f9e9-415b-a550-3006a4632ba4","Type":"ContainerDied","Data":"d44dad492e3736c612049c8b048068de134aee1a61264b8715dac1a1505eb90d"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833679 9368 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b85781bda41437456bdf7f25de8ffb27da808b560ac58497b8aae4bf24b2109f"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833687 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"49a13ebc694f26cd89010ddce04800eb4f4c986f75a07318bd04a364d89d8c75"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833698 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" event={"ID":"943feb0d-7d31-446a-9100-dfc4ef013d12","Type":"ContainerDied","Data":"d191b57d0995c3a104c3336c01e2a5bd2bc868dba6a6fcca53d04e312b18c0c9"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833707 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8c7c6a085ea6bbbf2982572791af9a9759a9fc311f8df3506418406ff3e1f36a"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833715 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0acf9557821accd587e8bd9912ad989c059f24ef17109b73584eca0d899729a7"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833724 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" event={"ID":"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f","Type":"ContainerDied","Data":"d4087ecceb78b95c5961d00b583ffbdd19fde6d2e05194469b5beb565e8c4e58"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833734 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b19da259a1027d9535ea67f48d90da4b466d1543b7cd71e10242c0f818c0341"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: 
I1203 20:06:15.833742 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d547bb93c93c86e1c0269c4fb32a10d62340ebafe98b4ab6c6927fd1a6493839"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833751 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" event={"ID":"daa8efc0-4514-4a14-80f5-ab9eca53a127","Type":"ContainerDied","Data":"6d3d33e94c6f769c3d4f30283e26a8ebfb068648191bff388aba17779108057c"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833769 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81434316be96a2b14a22680f8e4bee888b65b3e97cc5bc7df607a91a047bac12"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833796 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12ba33f367264d50b59a4676b1e61bc0a6d45703296fe265553724b4dbafb201"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833808 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" event={"ID":"a185ee17-4b4b-4d20-a8ed-56a2a01f1807","Type":"ContainerDied","Data":"6b7ea8626bddf0947a6929d715c64bbadf4eccc528c9e9ac527e662555f2ab85"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833818 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4379513206c1639335bb05ee8287de982289e551bc3d4966f636dc8340b2ecf8"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833827 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"341b601d165e50b66bc27ce1e4916e7fa4ef52e1059015c14333dc841ef12229"} Dec 03 
20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833837 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" event={"ID":"f749c7f2-1fd7-4078-a92d-0ae5523998ac","Type":"ContainerDied","Data":"b30a30d243315200a6f03be3c0553cf1e0283ee13ed3b826cd4d8aa9d7481e81"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833848 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c773db2a1877d6932c57258d42f5b394294b213e87c30f8a0a0e8aca67ad0063"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833857 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"744faadce32102a4f51bf311e6ce0b868fa1346e51cfebbdff76ea1eb3693fe2"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833866 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" event={"ID":"6eb4700c-6af0-468b-afc8-1e09b902d6bf","Type":"ContainerDied","Data":"f3b5610345e0a05c927b635b9b59c02c0bd317dc652790faf73852f8095009c9"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833876 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c45b306077a652b23f0900eb2cbb0416939e7dc4bb4d4fe2ac8622e1b6c0da5a"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833884 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"441d867492f0f10ece1761a5339bcd749dc935547bbd2edddb84af4fe04b1249"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833894 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" 
event={"ID":"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3","Type":"ContainerDied","Data":"fd5126a03583a9e60c4f08ab94ff3e4d6dff99b77efc94559f88151386831a39"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833904 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3bc3b0513399a4dbff7b1cc288d3bbd1c2bcb7799b1e463991c6f6704c28e766"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833913 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e8e3349c588e032c5b231176a3b6f6e0b37f7441a8fa80e01747aaded424ead5"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833922 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" event={"ID":"01d51d9a-9beb-4357-9dc2-aeac210cd0c4","Type":"ContainerDied","Data":"89033761971c21121ad0eb89f27a17b463a2b2ad814a0f77f8444c0013b9927d"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833932 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd46fba44017fc29b42f47a162580459a77d137ccc3daa28e90491a52f6c5e38"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833940 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"16f863a99a7b4db6f75ba856ee48509b29d62e76913caec7ed378fa26c23b8d6"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833968 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" event={"ID":"b8709c6c-8729-4702-a3fb-35a072855096","Type":"ContainerDied","Data":"f74560024271b473d288e14ac60c9ecd05f2a6752be21eac89b4a74e35f9a5d8"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833982 9368 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" event={"ID":"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf","Type":"ContainerDied","Data":"befef1f27ec31a7dce800c4fe3b217c928cd2c29d212afb9d75ef9e969b32b96"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.833992 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"739763509f2115327ad6e763bd6fe98f715c6203046f1bff98acddc694d4b998"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834002 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00ef38cb5e4574cde1559c4f74b2af2d1020f41ece0ea48de28dfccd34cbb389"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834011 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" event={"ID":"5b3ee9a2-0f17-4a04-9191-b60684ef6c29","Type":"ContainerStarted","Data":"0261bc02d30c23a023b1b2c969bc5effe6635690c48ec42070b21b48058d37f0"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834026 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" event={"ID":"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9","Type":"ContainerStarted","Data":"e4d111ea4bb5f2834fb95352ff94c389e71a98b14756480233a487fdada83623"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834037 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"2bc34dd3df75f29672c73e791045d1e82bca7040b7e6a8728aa43a5fe5c90f24"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834048 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834062 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" event={"ID":"367c2c7c-1fc8-4608-aa94-b64c6c70cc61","Type":"ContainerStarted","Data":"8112a7cb98ed4f9746283158ddbbb35ec5fbfefafdb864fd1afaa4c7f81f5842"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834073 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" event={"ID":"433c3273-c99e-4d68-befc-06f92d2fc8d5","Type":"ContainerStarted","Data":"ce746b6ddef271e1d96a05e1154da574f5881072098465b14e72e5c7eb209ecd"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834085 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" event={"ID":"7ed25861-1328-45e7-922e-37588a0b019c","Type":"ContainerStarted","Data":"54b0b0288807b4af02d2960f12e91660ba1555b54019fdc88271a4d95139f18a"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834098 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" event={"ID":"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f","Type":"ContainerStarted","Data":"db25cf44f0675c418850d8d41463efcb1765ff94722958664210b9165ac00ff3"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834111 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" 
event={"ID":"daa8efc0-4514-4a14-80f5-ab9eca53a127","Type":"ContainerStarted","Data":"2fddc42d6267903d2d9ec20253e1576f35e19a3bb53e9ddf0c42ac6c45e614ec"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834128 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" event={"ID":"5decce88-c71e-411c-87b5-a37dd0f77e7b","Type":"ContainerStarted","Data":"cedf724c8d88e49537acb023168c19d8828e3b366c579b4483751a3578b116f6"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834140 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" event={"ID":"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf","Type":"ContainerStarted","Data":"d77636ae6fa70a30480be55d0b3b081bbffecdd76b888e95fdd9a2954e04756e"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834152 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" event={"ID":"63e3d36d-1676-4f90-ac9a-d85b861a4655","Type":"ContainerStarted","Data":"6807dbf16e067be7ea486ac34b787f907c6cd7781565aaddc8b3f973b1b71212"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834164 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" event={"ID":"f9f99422-7991-40ef-92a1-de2e603e47b9","Type":"ContainerStarted","Data":"5e83bcf58af1482033711d7ef5e23c1429621a6a16b43c85914ace2af8aca901"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834177 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" event={"ID":"af2023e1-9c7a-40af-a6bf-fba31c3565b1","Type":"ContainerStarted","Data":"ac900c0e3bb1d9c962bbb16a701da09c17b23c0e09631a6ada5617d6d0661d7b"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834189 9368 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" event={"ID":"b8709c6c-8729-4702-a3fb-35a072855096","Type":"ContainerStarted","Data":"31a065365f002cc4975c79d46bbaecf6fa29671ce14c89ee80c0e33219e93dbf"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834201 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx" event={"ID":"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8","Type":"ContainerStarted","Data":"e3535a1d997ab09e4a278f3f938e8ede6d64b38d7a883c878fb38cee28d66811"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834212 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" event={"ID":"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3","Type":"ContainerStarted","Data":"64faeeb7a4647a9e5dd702400fe60f14013f02b00360bb310c4d37859f33d70c"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834227 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" event={"ID":"01d51d9a-9beb-4357-9dc2-aeac210cd0c4","Type":"ContainerStarted","Data":"c2730eaef31938f9b283223c81622c1d4bbc549630ded57fc1762a2568d60b23"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834239 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" event={"ID":"11e2c94f-f9e9-415b-a550-3006a4632ba4","Type":"ContainerStarted","Data":"89ed390af07eecb0f2a6fd24fe986b57e8e8f83dbf2ff2202963967a2fcc7b5e"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834250 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" 
event={"ID":"943feb0d-7d31-446a-9100-dfc4ef013d12","Type":"ContainerStarted","Data":"abf1acea0f13046f42e18d29f9f01a5591776e77d3e8cc4b525da74b968fc06b"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834262 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" event={"ID":"0d4e4f88-7106-4a46-8b63-053345922fb0","Type":"ContainerStarted","Data":"eb911dd1a21a84357ddeab75081be9604a2b72ead2988dbb5077ea467b4515c8"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834273 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" event={"ID":"a185ee17-4b4b-4d20-a8ed-56a2a01f1807","Type":"ContainerStarted","Data":"9e5ef7b3c0490a710149a5e033b19d384ce5d0dfe6bb0ef15f5d72d083cc1ce9"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834283 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" event={"ID":"f749c7f2-1fd7-4078-a92d-0ae5523998ac","Type":"ContainerStarted","Data":"f7b2dd4d7eafdc4336ee0182ab9a0527c12ff38408c8d52991e189907554e424"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834294 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw" event={"ID":"b84835e3-e8bc-4aa4-a8f3-f9be702a358a","Type":"ContainerStarted","Data":"956eb1f870a19470f7c8b22853a74608f38c1690adb7d4ce8636e2637a784bb5"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834306 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" event={"ID":"6eb4700c-6af0-468b-afc8-1e09b902d6bf","Type":"ContainerStarted","Data":"728aa51e420a0e8c358ef69d6ddcb175d50c7be37aab4f4fdfde93a0791a7b8e"} Dec 03 20:06:15.836254 master-0 
kubenswrapper[9368]: I1203 20:06:15.834322 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" event={"ID":"78a864f2-934f-4197-9753-24c9bc7f1fca","Type":"ContainerStarted","Data":"9f244f1d436466a4ae57b971d0160d2b30815a69ea07caf71d6b0728312b0abd"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834334 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"7134dda62a594c58ac76c0bee69ff785ac0ff610ef7d3c4df129e50bb11aec80"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834345 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"d4df7bcfbcc85bcadd5d89c40467a0c62a261fe9df1907801d9e1c35e6fc353d"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834355 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"0e716c369f30bcb4fd885d5df2bfefde9afaf605da0247b8ed1b0e099f4fccca"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834366 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"230fc5938de0c5a6e2516202d99d270da453c1967a8773858a25118455179d5a"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834378 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"ed490ef4ea8d419c12fbad3b98e447ddc9c1f2075c437754f8b557557383a2df"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834390 9368 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerDied","Data":"2bc34dd3df75f29672c73e791045d1e82bca7040b7e6a8728aa43a5fe5c90f24"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834402 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"23c2b742ed78624af8a87bafdac0a226661dbc177a2ddfac515be738b044bdfc"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834413 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" event={"ID":"367c2c7c-1fc8-4608-aa94-b64c6c70cc61","Type":"ContainerDied","Data":"8112a7cb98ed4f9746283158ddbbb35ec5fbfefafdb864fd1afaa4c7f81f5842"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834426 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1bde03f53f1aba9b728bfefdb85dd63f6a5517b4c8a0343559f64c1f03ce4e3c"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834434 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5992edcac541fa77269930de1a02dd784ce5397190135d38e4719fad6d964b45"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.834444 9368 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e21d1a78f01b2c86e8a517177c7568f6695fa81ca18975759c979beb59d6b4b"} Dec 03 20:06:15.836254 master-0 kubenswrapper[9368]: I1203 20:06:15.835576 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:06:15.845680 master-0 kubenswrapper[9368]: I1203 20:06:15.837291 9368 patch_prober.go:28] interesting pod/machine-config-daemon-7t8bs 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 20:06:15.845680 master-0 kubenswrapper[9368]: I1203 20:06:15.837335 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 20:06:15.845680 master-0 kubenswrapper[9368]: I1203 20:06:15.837510 9368 patch_prober.go:28] interesting pod/machine-config-daemon-7t8bs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 20:06:15.845680 master-0 kubenswrapper[9368]: I1203 20:06:15.837549 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 20:06:15.845680 master-0 kubenswrapper[9368]: I1203 20:06:15.840230 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Dec 03 20:06:15.845680 master-0 kubenswrapper[9368]: I1203 20:06:15.840269 9368 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="de7a9bf7-4f63-4341-8d40-435cbd00733f" Dec 03 20:06:15.845680 master-0 kubenswrapper[9368]: I1203 20:06:15.845022 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-ff788744d-hkt6c"] Dec 03 20:06:15.851019 master-0 kubenswrapper[9368]: I1203 20:06:15.850947 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Dec 03 20:06:15.851019 master-0 kubenswrapper[9368]: I1203 20:06:15.850986 9368 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="de7a9bf7-4f63-4341-8d40-435cbd00733f" Dec 03 20:06:15.865809 master-0 kubenswrapper[9368]: I1203 20:06:15.851898 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" podStartSLOduration=543.432149735 podStartE2EDuration="9m47.851873269s" podCreationTimestamp="2025-12-03 19:56:28 +0000 UTC" firstStartedPulling="2025-12-03 19:56:38.697907445 +0000 UTC m=+64.359157356" lastFinishedPulling="2025-12-03 19:57:23.117630979 +0000 UTC m=+108.778880890" observedRunningTime="2025-12-03 20:06:15.78471436 +0000 UTC m=+641.445964321" watchObservedRunningTime="2025-12-03 20:06:15.851873269 +0000 UTC m=+641.513123200" Dec 03 20:06:15.865809 master-0 kubenswrapper[9368]: I1203 20:06:15.855510 9368 scope.go:117] "RemoveContainer" containerID="8112a7cb98ed4f9746283158ddbbb35ec5fbfefafdb864fd1afaa4c7f81f5842" Dec 03 20:06:15.865809 master-0 kubenswrapper[9368]: E1203 20:06:15.855688 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-86897dd478-s29k7_openshift-cluster-storage-operator(367c2c7c-1fc8-4608-aa94-b64c6c70cc61)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" podUID="367c2c7c-1fc8-4608-aa94-b64c6c70cc61" Dec 03 20:06:15.865809 master-0 kubenswrapper[9368]: I1203 20:06:15.857714 9368 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" podStartSLOduration=543.239912817 podStartE2EDuration="9m47.857695871s" podCreationTimestamp="2025-12-03 19:56:28 +0000 UTC" firstStartedPulling="2025-12-03 19:56:38.50005604 +0000 UTC m=+64.161305951" lastFinishedPulling="2025-12-03 19:57:23.117839094 +0000 UTC m=+108.779089005" observedRunningTime="2025-12-03 20:06:15.825222185 +0000 UTC m=+641.486472116" watchObservedRunningTime="2025-12-03 20:06:15.857695871 +0000 UTC m=+641.518945792" Dec 03 20:06:15.866229 master-0 kubenswrapper[9368]: I1203 20:06:15.866154 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Dec 03 20:06:15.866229 master-0 kubenswrapper[9368]: I1203 20:06:15.866191 9368 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="abdc84bb-4998-43bb-90da-4358a6c3137d" Dec 03 20:06:15.869010 master-0 kubenswrapper[9368]: I1203 20:06:15.868854 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Dec 03 20:06:15.869010 master-0 kubenswrapper[9368]: I1203 20:06:15.868905 9368 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="abdc84bb-4998-43bb-90da-4358a6c3137d" Dec 03 20:06:15.892574 master-0 kubenswrapper[9368]: I1203 20:06:15.892533 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99"] Dec 03 20:06:15.907902 master-0 kubenswrapper[9368]: I1203 20:06:15.906075 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-npd99"] Dec 03 20:06:15.928258 master-0 kubenswrapper[9368]: I1203 20:06:15.928214 9368 scope.go:117] "RemoveContainer" 
containerID="7def324ef495c1e55c8e9233ccd93d3408c35454ff9a9bc3bac5d21a48173630" Dec 03 20:06:15.948795 master-0 kubenswrapper[9368]: I1203 20:06:15.945918 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" podStartSLOduration=545.274465708 podStartE2EDuration="9m49.945900836s" podCreationTimestamp="2025-12-03 19:56:26 +0000 UTC" firstStartedPulling="2025-12-03 19:56:38.311013708 +0000 UTC m=+63.972263619" lastFinishedPulling="2025-12-03 19:57:22.982448836 +0000 UTC m=+108.643698747" observedRunningTime="2025-12-03 20:06:15.942952824 +0000 UTC m=+641.604202765" watchObservedRunningTime="2025-12-03 20:06:15.945900836 +0000 UTC m=+641.607150747" Dec 03 20:06:16.004515 master-0 kubenswrapper[9368]: I1203 20:06:15.999444 9368 scope.go:117] "RemoveContainer" containerID="739763509f2115327ad6e763bd6fe98f715c6203046f1bff98acddc694d4b998" Dec 03 20:06:16.055999 master-0 kubenswrapper[9368]: I1203 20:06:16.055956 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg" podStartSLOduration=553.374007764 podStartE2EDuration="9m57.055939127s" podCreationTimestamp="2025-12-03 19:56:19 +0000 UTC" firstStartedPulling="2025-12-03 19:56:37.355090318 +0000 UTC m=+63.016340249" lastFinishedPulling="2025-12-03 19:57:21.037021701 +0000 UTC m=+106.698271612" observedRunningTime="2025-12-03 20:06:16.054198705 +0000 UTC m=+641.715448616" watchObservedRunningTime="2025-12-03 20:06:16.055939127 +0000 UTC m=+641.717189038" Dec 03 20:06:16.057422 master-0 kubenswrapper[9368]: I1203 20:06:16.057407 9368 scope.go:117] "RemoveContainer" containerID="00ef38cb5e4574cde1559c4f74b2af2d1020f41ece0ea48de28dfccd34cbb389" Dec 03 20:06:16.084173 master-0 kubenswrapper[9368]: I1203 20:06:16.084133 9368 scope.go:117] "RemoveContainer" containerID="92f773171332fe1047c615595f130c77be75d242dc97e2d6092f29a8d7898322" Dec 03 
20:06:16.093508 master-0 kubenswrapper[9368]: I1203 20:06:16.093454 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" podStartSLOduration=546.852994558 podStartE2EDuration="9m51.093441008s" podCreationTimestamp="2025-12-03 19:56:25 +0000 UTC" firstStartedPulling="2025-12-03 19:56:38.877047665 +0000 UTC m=+64.538297576" lastFinishedPulling="2025-12-03 19:57:23.117494105 +0000 UTC m=+108.778744026" observedRunningTime="2025-12-03 20:06:16.089650384 +0000 UTC m=+641.750900315" watchObservedRunningTime="2025-12-03 20:06:16.093441008 +0000 UTC m=+641.754690919" Dec 03 20:06:16.122495 master-0 kubenswrapper[9368]: I1203 20:06:16.122458 9368 scope.go:117] "RemoveContainer" containerID="95d0ca3a853fd9f93e01c67870d1d4d269549c7560c451b67830fa1b176c7eb8" Dec 03 20:06:16.152059 master-0 kubenswrapper[9368]: I1203 20:06:16.152036 9368 scope.go:117] "RemoveContainer" containerID="46ead743a71c6c2931e92ae425d4f75d1fb17286150d55d4a739c7296e0b2be0" Dec 03 20:06:16.164262 master-0 kubenswrapper[9368]: I1203 20:06:16.164204 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv" podStartSLOduration=545.361208924 podStartE2EDuration="9m50.164183993s" podCreationTimestamp="2025-12-03 19:56:26 +0000 UTC" firstStartedPulling="2025-12-03 19:56:38.179836306 +0000 UTC m=+63.841086217" lastFinishedPulling="2025-12-03 19:57:22.982811375 +0000 UTC m=+108.644061286" observedRunningTime="2025-12-03 20:06:16.144219424 +0000 UTC m=+641.805469335" watchObservedRunningTime="2025-12-03 20:06:16.164183993 +0000 UTC m=+641.825433904" Dec 03 20:06:16.166332 master-0 kubenswrapper[9368]: I1203 20:06:16.166295 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podStartSLOduration=560.166288266 
podStartE2EDuration="9m20.166288266s" podCreationTimestamp="2025-12-03 19:56:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:06:16.164544592 +0000 UTC m=+641.825794503" watchObservedRunningTime="2025-12-03 20:06:16.166288266 +0000 UTC m=+641.827538177" Dec 03 20:06:16.173503 master-0 kubenswrapper[9368]: I1203 20:06:16.172385 9368 scope.go:117] "RemoveContainer" containerID="654ce27fb70f480beba5ca8af4a5c2faaea9183cad789692159d1b32739ab7ee" Dec 03 20:06:16.191670 master-0 kubenswrapper[9368]: I1203 20:06:16.189550 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sp868" podStartSLOduration=534.844767644 podStartE2EDuration="10m11.189521255s" podCreationTimestamp="2025-12-03 19:56:05 +0000 UTC" firstStartedPulling="2025-12-03 19:56:06.991558862 +0000 UTC m=+32.652808773" lastFinishedPulling="2025-12-03 19:57:23.336312473 +0000 UTC m=+108.997562384" observedRunningTime="2025-12-03 20:06:16.187854755 +0000 UTC m=+641.849104676" watchObservedRunningTime="2025-12-03 20:06:16.189521255 +0000 UTC m=+641.850771166" Dec 03 20:06:16.198341 master-0 kubenswrapper[9368]: I1203 20:06:16.198299 9368 scope.go:117] "RemoveContainer" containerID="c773db2a1877d6932c57258d42f5b394294b213e87c30f8a0a0e8aca67ad0063" Dec 03 20:06:16.246107 master-0 kubenswrapper[9368]: I1203 20:06:16.241215 9368 scope.go:117] "RemoveContainer" containerID="744faadce32102a4f51bf311e6ce0b868fa1346e51cfebbdff76ea1eb3693fe2" Dec 03 20:06:16.272767 master-0 kubenswrapper[9368]: I1203 20:06:16.272729 9368 scope.go:117] "RemoveContainer" containerID="fd46fba44017fc29b42f47a162580459a77d137ccc3daa28e90491a52f6c5e38" Dec 03 20:06:16.273056 master-0 kubenswrapper[9368]: I1203 20:06:16.272987 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mc8kx" 
podStartSLOduration=535.174527437 podStartE2EDuration="10m9.272961953s" podCreationTimestamp="2025-12-03 19:56:07 +0000 UTC" firstStartedPulling="2025-12-03 19:56:09.025628947 +0000 UTC m=+34.686878858" lastFinishedPulling="2025-12-03 19:57:23.124063453 +0000 UTC m=+108.785313374" observedRunningTime="2025-12-03 20:06:16.270020561 +0000 UTC m=+641.931270482" watchObservedRunningTime="2025-12-03 20:06:16.272961953 +0000 UTC m=+641.934211854" Dec 03 20:06:16.294330 master-0 kubenswrapper[9368]: I1203 20:06:16.294294 9368 scope.go:117] "RemoveContainer" containerID="16f863a99a7b4db6f75ba856ee48509b29d62e76913caec7ed378fa26c23b8d6" Dec 03 20:06:16.295806 master-0 kubenswrapper[9368]: I1203 20:06:16.295737 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" podStartSLOduration=536.292731993 podStartE2EDuration="9m40.295716842s" podCreationTimestamp="2025-12-03 19:56:36 +0000 UTC" firstStartedPulling="2025-12-03 19:56:39.22802433 +0000 UTC m=+64.889274241" lastFinishedPulling="2025-12-03 19:57:23.231009179 +0000 UTC m=+108.892259090" observedRunningTime="2025-12-03 20:06:16.294939543 +0000 UTC m=+641.956189464" watchObservedRunningTime="2025-12-03 20:06:16.295716842 +0000 UTC m=+641.956966753" Dec 03 20:06:16.321188 master-0 kubenswrapper[9368]: I1203 20:06:16.320342 9368 scope.go:117] "RemoveContainer" containerID="b85781bda41437456bdf7f25de8ffb27da808b560ac58497b8aae4bf24b2109f" Dec 03 20:06:16.323610 master-0 kubenswrapper[9368]: I1203 20:06:16.323520 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r2c8x" podStartSLOduration=535.048051783 podStartE2EDuration="10m10.323499624s" podCreationTimestamp="2025-12-03 19:56:06 +0000 UTC" firstStartedPulling="2025-12-03 19:56:08.018017854 +0000 UTC m=+33.679267805" lastFinishedPulling="2025-12-03 19:57:23.293465735 +0000 UTC m=+108.954715646" 
observedRunningTime="2025-12-03 20:06:16.321174776 +0000 UTC m=+641.982424687" watchObservedRunningTime="2025-12-03 20:06:16.323499624 +0000 UTC m=+641.984749555" Dec 03 20:06:16.344130 master-0 kubenswrapper[9368]: I1203 20:06:16.344087 9368 scope.go:117] "RemoveContainer" containerID="49a13ebc694f26cd89010ddce04800eb4f4c986f75a07318bd04a364d89d8c75" Dec 03 20:06:16.350974 master-0 kubenswrapper[9368]: I1203 20:06:16.348841 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6zrxk" podStartSLOduration=535.097357513 podStartE2EDuration="10m8.348825135s" podCreationTimestamp="2025-12-03 19:56:08 +0000 UTC" firstStartedPulling="2025-12-03 19:56:10.041549611 +0000 UTC m=+35.702799522" lastFinishedPulling="2025-12-03 19:57:23.293017243 +0000 UTC m=+108.954267144" observedRunningTime="2025-12-03 20:06:16.348334683 +0000 UTC m=+642.009584594" watchObservedRunningTime="2025-12-03 20:06:16.348825135 +0000 UTC m=+642.010075036" Dec 03 20:06:16.383925 master-0 kubenswrapper[9368]: I1203 20:06:16.376633 9368 scope.go:117] "RemoveContainer" containerID="1b19da259a1027d9535ea67f48d90da4b466d1543b7cd71e10242c0f818c0341" Dec 03 20:06:16.400808 master-0 kubenswrapper[9368]: I1203 20:06:16.400758 9368 scope.go:117] "RemoveContainer" containerID="d547bb93c93c86e1c0269c4fb32a10d62340ebafe98b4ab6c6927fd1a6493839" Dec 03 20:06:16.426618 master-0 kubenswrapper[9368]: I1203 20:06:16.426492 9368 scope.go:117] "RemoveContainer" containerID="81434316be96a2b14a22680f8e4bee888b65b3e97cc5bc7df607a91a047bac12" Dec 03 20:06:16.428439 master-0 kubenswrapper[9368]: I1203 20:06:16.427716 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/2.log" Dec 03 20:06:16.434289 master-0 kubenswrapper[9368]: I1203 20:06:16.434248 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" event={"ID":"1c22cb59-5083-4be6-9998-a9e67a2c20cd","Type":"ContainerStarted","Data":"263c9892d0db3a282cd4fd76feedfc7a2f00079133490560ffda1c5aceb719de"} Dec 03 20:06:16.434289 master-0 kubenswrapper[9368]: I1203 20:06:16.434289 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" event={"ID":"1c22cb59-5083-4be6-9998-a9e67a2c20cd","Type":"ContainerStarted","Data":"bb807fb004e1c5a8c12ce908fa4f2effefa5e62f25142bb2fe3ec8dd74d140f1"} Dec 03 20:06:16.439984 master-0 kubenswrapper[9368]: I1203 20:06:16.434411 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:06:16.439984 master-0 kubenswrapper[9368]: I1203 20:06:16.436632 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/3.log" Dec 03 20:06:16.439984 master-0 kubenswrapper[9368]: I1203 20:06:16.439641 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/3.log" Dec 03 20:06:16.439984 master-0 kubenswrapper[9368]: I1203 20:06:16.439976 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:06:16.441305 master-0 kubenswrapper[9368]: I1203 20:06:16.441284 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/2.log" Dec 03 20:06:16.449235 master-0 kubenswrapper[9368]: I1203 20:06:16.449203 9368 scope.go:117] "RemoveContainer" 
containerID="12ba33f367264d50b59a4676b1e61bc0a6d45703296fe265553724b4dbafb201" Dec 03 20:06:16.451053 master-0 kubenswrapper[9368]: I1203 20:06:16.450975 9368 generic.go:334] "Generic (PLEG): container finished" podID="7bce50c457ac1f4721bc81a570dd238a" containerID="202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81" exitCode=0 Dec 03 20:06:16.451053 master-0 kubenswrapper[9368]: I1203 20:06:16.451030 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerDied","Data":"202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81"} Dec 03 20:06:16.451146 master-0 kubenswrapper[9368]: I1203 20:06:16.451069 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"9c9e87ce301fc8a45a8230a2fb6405ec79e3eaf7be312bbca4d4b9762860041c"} Dec 03 20:06:16.452663 master-0 kubenswrapper[9368]: I1203 20:06:16.452620 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/3.log" Dec 03 20:06:16.454373 master-0 kubenswrapper[9368]: I1203 20:06:16.454344 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q47xb_433c3273-c99e-4d68-befc-06f92d2fc8d5/cluster-baremetal-operator/2.log" Dec 03 20:06:16.457269 master-0 kubenswrapper[9368]: I1203 20:06:16.457234 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-j2wgx_5b3ee9a2-0f17-4a04-9191-b60684ef6c29/kube-scheduler-operator-container/2.log" Dec 03 20:06:16.474663 master-0 kubenswrapper[9368]: I1203 20:06:16.474573 9368 scope.go:117] "RemoveContainer" 
containerID="4379513206c1639335bb05ee8287de982289e551bc3d4966f636dc8340b2ecf8" Dec 03 20:06:16.505094 master-0 kubenswrapper[9368]: I1203 20:06:16.499666 9368 scope.go:117] "RemoveContainer" containerID="341b601d165e50b66bc27ce1e4916e7fa4ef52e1059015c14333dc841ef12229" Dec 03 20:06:16.508661 master-0 kubenswrapper[9368]: I1203 20:06:16.508618 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"] Dec 03 20:06:16.511133 master-0 kubenswrapper[9368]: I1203 20:06:16.511101 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-869d689b5b-brqck"] Dec 03 20:06:16.520665 master-0 kubenswrapper[9368]: I1203 20:06:16.520424 9368 scope.go:117] "RemoveContainer" containerID="3bc3b0513399a4dbff7b1cc288d3bbd1c2bcb7799b1e463991c6f6704c28e766" Dec 03 20:06:16.527713 master-0 kubenswrapper[9368]: I1203 20:06:16.527658 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4" podStartSLOduration=544.400704277 podStartE2EDuration="9m48.527647454s" podCreationTimestamp="2025-12-03 19:56:28 +0000 UTC" firstStartedPulling="2025-12-03 19:56:38.85555479 +0000 UTC m=+64.516804701" lastFinishedPulling="2025-12-03 19:57:22.982497927 +0000 UTC m=+108.643747878" observedRunningTime="2025-12-03 20:06:16.527016888 +0000 UTC m=+642.188266819" watchObservedRunningTime="2025-12-03 20:06:16.527647454 +0000 UTC m=+642.188897365" Dec 03 20:06:16.542462 master-0 kubenswrapper[9368]: I1203 20:06:16.541983 9368 scope.go:117] "RemoveContainer" containerID="e8e3349c588e032c5b231176a3b6f6e0b37f7441a8fa80e01747aaded424ead5" Dec 03 20:06:16.553343 master-0 kubenswrapper[9368]: I1203 20:06:16.553305 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61b16a8a-27a2-4a07-b5f9-10a5be2ec870" 
path="/var/lib/kubelet/pods/61b16a8a-27a2-4a07-b5f9-10a5be2ec870/volumes" Dec 03 20:06:16.554221 master-0 kubenswrapper[9368]: I1203 20:06:16.554188 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5cad72f-5bbf-42fc-9d63-545a01c98cbe" path="/var/lib/kubelet/pods/b5cad72f-5bbf-42fc-9d63-545a01c98cbe/volumes" Dec 03 20:06:16.570279 master-0 kubenswrapper[9368]: I1203 20:06:16.570244 9368 scope.go:117] "RemoveContainer" containerID="c45b306077a652b23f0900eb2cbb0416939e7dc4bb4d4fe2ac8622e1b6c0da5a" Dec 03 20:06:16.592126 master-0 kubenswrapper[9368]: I1203 20:06:16.592088 9368 scope.go:117] "RemoveContainer" containerID="441d867492f0f10ece1761a5339bcd749dc935547bbd2edddb84af4fe04b1249" Dec 03 20:06:16.613451 master-0 kubenswrapper[9368]: I1203 20:06:16.613414 9368 scope.go:117] "RemoveContainer" containerID="8c7c6a085ea6bbbf2982572791af9a9759a9fc311f8df3506418406ff3e1f36a" Dec 03 20:06:16.647563 master-0 kubenswrapper[9368]: I1203 20:06:16.645136 9368 scope.go:117] "RemoveContainer" containerID="0acf9557821accd587e8bd9912ad989c059f24ef17109b73584eca0d899729a7" Dec 03 20:06:16.670820 master-0 kubenswrapper[9368]: I1203 20:06:16.670753 9368 scope.go:117] "RemoveContainer" containerID="1bde03f53f1aba9b728bfefdb85dd63f6a5517b4c8a0343559f64c1f03ce4e3c" Dec 03 20:06:16.692510 master-0 kubenswrapper[9368]: I1203 20:06:16.692454 9368 scope.go:117] "RemoveContainer" containerID="5992edcac541fa77269930de1a02dd784ce5397190135d38e4719fad6d964b45" Dec 03 20:06:16.715909 master-0 kubenswrapper[9368]: I1203 20:06:16.715853 9368 scope.go:117] "RemoveContainer" containerID="0e21d1a78f01b2c86e8a517177c7568f6695fa81ca18975759c979beb59d6b4b" Dec 03 20:06:16.751895 master-0 kubenswrapper[9368]: I1203 20:06:16.751844 9368 scope.go:117] "RemoveContainer" containerID="1bde03f53f1aba9b728bfefdb85dd63f6a5517b4c8a0343559f64c1f03ce4e3c" Dec 03 20:06:16.752459 master-0 kubenswrapper[9368]: E1203 20:06:16.752407 9368 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1bde03f53f1aba9b728bfefdb85dd63f6a5517b4c8a0343559f64c1f03ce4e3c\": container with ID starting with 1bde03f53f1aba9b728bfefdb85dd63f6a5517b4c8a0343559f64c1f03ce4e3c not found: ID does not exist" containerID="1bde03f53f1aba9b728bfefdb85dd63f6a5517b4c8a0343559f64c1f03ce4e3c" Dec 03 20:06:16.752513 master-0 kubenswrapper[9368]: I1203 20:06:16.752471 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bde03f53f1aba9b728bfefdb85dd63f6a5517b4c8a0343559f64c1f03ce4e3c"} err="failed to get container status \"1bde03f53f1aba9b728bfefdb85dd63f6a5517b4c8a0343559f64c1f03ce4e3c\": rpc error: code = NotFound desc = could not find container \"1bde03f53f1aba9b728bfefdb85dd63f6a5517b4c8a0343559f64c1f03ce4e3c\": container with ID starting with 1bde03f53f1aba9b728bfefdb85dd63f6a5517b4c8a0343559f64c1f03ce4e3c not found: ID does not exist" Dec 03 20:06:16.752555 master-0 kubenswrapper[9368]: I1203 20:06:16.752509 9368 scope.go:117] "RemoveContainer" containerID="5992edcac541fa77269930de1a02dd784ce5397190135d38e4719fad6d964b45" Dec 03 20:06:16.753229 master-0 kubenswrapper[9368]: E1203 20:06:16.753187 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5992edcac541fa77269930de1a02dd784ce5397190135d38e4719fad6d964b45\": container with ID starting with 5992edcac541fa77269930de1a02dd784ce5397190135d38e4719fad6d964b45 not found: ID does not exist" containerID="5992edcac541fa77269930de1a02dd784ce5397190135d38e4719fad6d964b45" Dec 03 20:06:16.753314 master-0 kubenswrapper[9368]: I1203 20:06:16.753246 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5992edcac541fa77269930de1a02dd784ce5397190135d38e4719fad6d964b45"} err="failed to get container status \"5992edcac541fa77269930de1a02dd784ce5397190135d38e4719fad6d964b45\": rpc error: code = NotFound 
desc = could not find container \"5992edcac541fa77269930de1a02dd784ce5397190135d38e4719fad6d964b45\": container with ID starting with 5992edcac541fa77269930de1a02dd784ce5397190135d38e4719fad6d964b45 not found: ID does not exist" Dec 03 20:06:16.753314 master-0 kubenswrapper[9368]: I1203 20:06:16.753287 9368 scope.go:117] "RemoveContainer" containerID="0e21d1a78f01b2c86e8a517177c7568f6695fa81ca18975759c979beb59d6b4b" Dec 03 20:06:16.753594 master-0 kubenswrapper[9368]: E1203 20:06:16.753558 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e21d1a78f01b2c86e8a517177c7568f6695fa81ca18975759c979beb59d6b4b\": container with ID starting with 0e21d1a78f01b2c86e8a517177c7568f6695fa81ca18975759c979beb59d6b4b not found: ID does not exist" containerID="0e21d1a78f01b2c86e8a517177c7568f6695fa81ca18975759c979beb59d6b4b" Dec 03 20:06:16.753639 master-0 kubenswrapper[9368]: I1203 20:06:16.753597 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e21d1a78f01b2c86e8a517177c7568f6695fa81ca18975759c979beb59d6b4b"} err="failed to get container status \"0e21d1a78f01b2c86e8a517177c7568f6695fa81ca18975759c979beb59d6b4b\": rpc error: code = NotFound desc = could not find container \"0e21d1a78f01b2c86e8a517177c7568f6695fa81ca18975759c979beb59d6b4b\": container with ID starting with 0e21d1a78f01b2c86e8a517177c7568f6695fa81ca18975759c979beb59d6b4b not found: ID does not exist" Dec 03 20:06:16.753639 master-0 kubenswrapper[9368]: I1203 20:06:16.753625 9368 scope.go:117] "RemoveContainer" containerID="92f773171332fe1047c615595f130c77be75d242dc97e2d6092f29a8d7898322" Dec 03 20:06:16.754035 master-0 kubenswrapper[9368]: E1203 20:06:16.754003 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92f773171332fe1047c615595f130c77be75d242dc97e2d6092f29a8d7898322\": container with ID starting with 
92f773171332fe1047c615595f130c77be75d242dc97e2d6092f29a8d7898322 not found: ID does not exist" containerID="92f773171332fe1047c615595f130c77be75d242dc97e2d6092f29a8d7898322" Dec 03 20:06:16.754085 master-0 kubenswrapper[9368]: I1203 20:06:16.754045 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92f773171332fe1047c615595f130c77be75d242dc97e2d6092f29a8d7898322"} err="failed to get container status \"92f773171332fe1047c615595f130c77be75d242dc97e2d6092f29a8d7898322\": rpc error: code = NotFound desc = could not find container \"92f773171332fe1047c615595f130c77be75d242dc97e2d6092f29a8d7898322\": container with ID starting with 92f773171332fe1047c615595f130c77be75d242dc97e2d6092f29a8d7898322 not found: ID does not exist" Dec 03 20:06:16.754085 master-0 kubenswrapper[9368]: I1203 20:06:16.754072 9368 scope.go:117] "RemoveContainer" containerID="95d0ca3a853fd9f93e01c67870d1d4d269549c7560c451b67830fa1b176c7eb8" Dec 03 20:06:16.754368 master-0 kubenswrapper[9368]: E1203 20:06:16.754333 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95d0ca3a853fd9f93e01c67870d1d4d269549c7560c451b67830fa1b176c7eb8\": container with ID starting with 95d0ca3a853fd9f93e01c67870d1d4d269549c7560c451b67830fa1b176c7eb8 not found: ID does not exist" containerID="95d0ca3a853fd9f93e01c67870d1d4d269549c7560c451b67830fa1b176c7eb8" Dec 03 20:06:16.754441 master-0 kubenswrapper[9368]: I1203 20:06:16.754374 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95d0ca3a853fd9f93e01c67870d1d4d269549c7560c451b67830fa1b176c7eb8"} err="failed to get container status \"95d0ca3a853fd9f93e01c67870d1d4d269549c7560c451b67830fa1b176c7eb8\": rpc error: code = NotFound desc = could not find container \"95d0ca3a853fd9f93e01c67870d1d4d269549c7560c451b67830fa1b176c7eb8\": container with ID starting with 
95d0ca3a853fd9f93e01c67870d1d4d269549c7560c451b67830fa1b176c7eb8 not found: ID does not exist" Dec 03 20:06:16.754441 master-0 kubenswrapper[9368]: I1203 20:06:16.754400 9368 scope.go:117] "RemoveContainer" containerID="90564517af04049d6ec0e898c2ae0505288ea36bcc26e8b87f6cfddbd789cf9b" Dec 03 20:06:16.754693 master-0 kubenswrapper[9368]: E1203 20:06:16.754663 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90564517af04049d6ec0e898c2ae0505288ea36bcc26e8b87f6cfddbd789cf9b\": container with ID starting with 90564517af04049d6ec0e898c2ae0505288ea36bcc26e8b87f6cfddbd789cf9b not found: ID does not exist" containerID="90564517af04049d6ec0e898c2ae0505288ea36bcc26e8b87f6cfddbd789cf9b" Dec 03 20:06:16.754752 master-0 kubenswrapper[9368]: I1203 20:06:16.754700 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90564517af04049d6ec0e898c2ae0505288ea36bcc26e8b87f6cfddbd789cf9b"} err="failed to get container status \"90564517af04049d6ec0e898c2ae0505288ea36bcc26e8b87f6cfddbd789cf9b\": rpc error: code = NotFound desc = could not find container \"90564517af04049d6ec0e898c2ae0505288ea36bcc26e8b87f6cfddbd789cf9b\": container with ID starting with 90564517af04049d6ec0e898c2ae0505288ea36bcc26e8b87f6cfddbd789cf9b not found: ID does not exist" Dec 03 20:06:16.754752 master-0 kubenswrapper[9368]: I1203 20:06:16.754725 9368 scope.go:117] "RemoveContainer" containerID="7def324ef495c1e55c8e9233ccd93d3408c35454ff9a9bc3bac5d21a48173630" Dec 03 20:06:16.755007 master-0 kubenswrapper[9368]: E1203 20:06:16.754976 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7def324ef495c1e55c8e9233ccd93d3408c35454ff9a9bc3bac5d21a48173630\": container with ID starting with 7def324ef495c1e55c8e9233ccd93d3408c35454ff9a9bc3bac5d21a48173630 not found: ID does not exist" 
containerID="7def324ef495c1e55c8e9233ccd93d3408c35454ff9a9bc3bac5d21a48173630" Dec 03 20:06:16.755065 master-0 kubenswrapper[9368]: I1203 20:06:16.755012 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7def324ef495c1e55c8e9233ccd93d3408c35454ff9a9bc3bac5d21a48173630"} err="failed to get container status \"7def324ef495c1e55c8e9233ccd93d3408c35454ff9a9bc3bac5d21a48173630\": rpc error: code = NotFound desc = could not find container \"7def324ef495c1e55c8e9233ccd93d3408c35454ff9a9bc3bac5d21a48173630\": container with ID starting with 7def324ef495c1e55c8e9233ccd93d3408c35454ff9a9bc3bac5d21a48173630 not found: ID does not exist" Dec 03 20:06:16.755065 master-0 kubenswrapper[9368]: I1203 20:06:16.755039 9368 scope.go:117] "RemoveContainer" containerID="b85781bda41437456bdf7f25de8ffb27da808b560ac58497b8aae4bf24b2109f" Dec 03 20:06:16.755324 master-0 kubenswrapper[9368]: E1203 20:06:16.755293 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b85781bda41437456bdf7f25de8ffb27da808b560ac58497b8aae4bf24b2109f\": container with ID starting with b85781bda41437456bdf7f25de8ffb27da808b560ac58497b8aae4bf24b2109f not found: ID does not exist" containerID="b85781bda41437456bdf7f25de8ffb27da808b560ac58497b8aae4bf24b2109f" Dec 03 20:06:16.755383 master-0 kubenswrapper[9368]: I1203 20:06:16.755331 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b85781bda41437456bdf7f25de8ffb27da808b560ac58497b8aae4bf24b2109f"} err="failed to get container status \"b85781bda41437456bdf7f25de8ffb27da808b560ac58497b8aae4bf24b2109f\": rpc error: code = NotFound desc = could not find container \"b85781bda41437456bdf7f25de8ffb27da808b560ac58497b8aae4bf24b2109f\": container with ID starting with b85781bda41437456bdf7f25de8ffb27da808b560ac58497b8aae4bf24b2109f not found: ID does not exist" Dec 03 20:06:16.755383 master-0 
kubenswrapper[9368]: I1203 20:06:16.755357 9368 scope.go:117] "RemoveContainer" containerID="49a13ebc694f26cd89010ddce04800eb4f4c986f75a07318bd04a364d89d8c75" Dec 03 20:06:16.755752 master-0 kubenswrapper[9368]: E1203 20:06:16.755702 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49a13ebc694f26cd89010ddce04800eb4f4c986f75a07318bd04a364d89d8c75\": container with ID starting with 49a13ebc694f26cd89010ddce04800eb4f4c986f75a07318bd04a364d89d8c75 not found: ID does not exist" containerID="49a13ebc694f26cd89010ddce04800eb4f4c986f75a07318bd04a364d89d8c75" Dec 03 20:06:16.755831 master-0 kubenswrapper[9368]: I1203 20:06:16.755744 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49a13ebc694f26cd89010ddce04800eb4f4c986f75a07318bd04a364d89d8c75"} err="failed to get container status \"49a13ebc694f26cd89010ddce04800eb4f4c986f75a07318bd04a364d89d8c75\": rpc error: code = NotFound desc = could not find container \"49a13ebc694f26cd89010ddce04800eb4f4c986f75a07318bd04a364d89d8c75\": container with ID starting with 49a13ebc694f26cd89010ddce04800eb4f4c986f75a07318bd04a364d89d8c75 not found: ID does not exist" Dec 03 20:06:16.755831 master-0 kubenswrapper[9368]: I1203 20:06:16.755772 9368 scope.go:117] "RemoveContainer" containerID="81434316be96a2b14a22680f8e4bee888b65b3e97cc5bc7df607a91a047bac12" Dec 03 20:06:16.760855 master-0 kubenswrapper[9368]: E1203 20:06:16.756253 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81434316be96a2b14a22680f8e4bee888b65b3e97cc5bc7df607a91a047bac12\": container with ID starting with 81434316be96a2b14a22680f8e4bee888b65b3e97cc5bc7df607a91a047bac12 not found: ID does not exist" containerID="81434316be96a2b14a22680f8e4bee888b65b3e97cc5bc7df607a91a047bac12" Dec 03 20:06:16.760855 master-0 kubenswrapper[9368]: I1203 20:06:16.757466 9368 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81434316be96a2b14a22680f8e4bee888b65b3e97cc5bc7df607a91a047bac12"} err="failed to get container status \"81434316be96a2b14a22680f8e4bee888b65b3e97cc5bc7df607a91a047bac12\": rpc error: code = NotFound desc = could not find container \"81434316be96a2b14a22680f8e4bee888b65b3e97cc5bc7df607a91a047bac12\": container with ID starting with 81434316be96a2b14a22680f8e4bee888b65b3e97cc5bc7df607a91a047bac12 not found: ID does not exist" Dec 03 20:06:16.760855 master-0 kubenswrapper[9368]: I1203 20:06:16.757495 9368 scope.go:117] "RemoveContainer" containerID="12ba33f367264d50b59a4676b1e61bc0a6d45703296fe265553724b4dbafb201" Dec 03 20:06:16.760855 master-0 kubenswrapper[9368]: E1203 20:06:16.759043 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12ba33f367264d50b59a4676b1e61bc0a6d45703296fe265553724b4dbafb201\": container with ID starting with 12ba33f367264d50b59a4676b1e61bc0a6d45703296fe265553724b4dbafb201 not found: ID does not exist" containerID="12ba33f367264d50b59a4676b1e61bc0a6d45703296fe265553724b4dbafb201" Dec 03 20:06:16.760855 master-0 kubenswrapper[9368]: I1203 20:06:16.759089 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12ba33f367264d50b59a4676b1e61bc0a6d45703296fe265553724b4dbafb201"} err="failed to get container status \"12ba33f367264d50b59a4676b1e61bc0a6d45703296fe265553724b4dbafb201\": rpc error: code = NotFound desc = could not find container \"12ba33f367264d50b59a4676b1e61bc0a6d45703296fe265553724b4dbafb201\": container with ID starting with 12ba33f367264d50b59a4676b1e61bc0a6d45703296fe265553724b4dbafb201 not found: ID does not exist" Dec 03 20:06:16.760855 master-0 kubenswrapper[9368]: I1203 20:06:16.759124 9368 scope.go:117] "RemoveContainer" containerID="1b19da259a1027d9535ea67f48d90da4b466d1543b7cd71e10242c0f818c0341" Dec 03 
20:06:16.760855 master-0 kubenswrapper[9368]: E1203 20:06:16.759496 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b19da259a1027d9535ea67f48d90da4b466d1543b7cd71e10242c0f818c0341\": container with ID starting with 1b19da259a1027d9535ea67f48d90da4b466d1543b7cd71e10242c0f818c0341 not found: ID does not exist" containerID="1b19da259a1027d9535ea67f48d90da4b466d1543b7cd71e10242c0f818c0341" Dec 03 20:06:16.760855 master-0 kubenswrapper[9368]: I1203 20:06:16.759524 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b19da259a1027d9535ea67f48d90da4b466d1543b7cd71e10242c0f818c0341"} err="failed to get container status \"1b19da259a1027d9535ea67f48d90da4b466d1543b7cd71e10242c0f818c0341\": rpc error: code = NotFound desc = could not find container \"1b19da259a1027d9535ea67f48d90da4b466d1543b7cd71e10242c0f818c0341\": container with ID starting with 1b19da259a1027d9535ea67f48d90da4b466d1543b7cd71e10242c0f818c0341 not found: ID does not exist" Dec 03 20:06:16.760855 master-0 kubenswrapper[9368]: I1203 20:06:16.759549 9368 scope.go:117] "RemoveContainer" containerID="d547bb93c93c86e1c0269c4fb32a10d62340ebafe98b4ab6c6927fd1a6493839" Dec 03 20:06:16.760855 master-0 kubenswrapper[9368]: E1203 20:06:16.759947 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d547bb93c93c86e1c0269c4fb32a10d62340ebafe98b4ab6c6927fd1a6493839\": container with ID starting with d547bb93c93c86e1c0269c4fb32a10d62340ebafe98b4ab6c6927fd1a6493839 not found: ID does not exist" containerID="d547bb93c93c86e1c0269c4fb32a10d62340ebafe98b4ab6c6927fd1a6493839" Dec 03 20:06:16.760855 master-0 kubenswrapper[9368]: I1203 20:06:16.759986 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d547bb93c93c86e1c0269c4fb32a10d62340ebafe98b4ab6c6927fd1a6493839"} err="failed 
to get container status \"d547bb93c93c86e1c0269c4fb32a10d62340ebafe98b4ab6c6927fd1a6493839\": rpc error: code = NotFound desc = could not find container \"d547bb93c93c86e1c0269c4fb32a10d62340ebafe98b4ab6c6927fd1a6493839\": container with ID starting with d547bb93c93c86e1c0269c4fb32a10d62340ebafe98b4ab6c6927fd1a6493839 not found: ID does not exist" Dec 03 20:06:16.760855 master-0 kubenswrapper[9368]: I1203 20:06:16.760031 9368 scope.go:117] "RemoveContainer" containerID="4379513206c1639335bb05ee8287de982289e551bc3d4966f636dc8340b2ecf8" Dec 03 20:06:16.760855 master-0 kubenswrapper[9368]: E1203 20:06:16.760329 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4379513206c1639335bb05ee8287de982289e551bc3d4966f636dc8340b2ecf8\": container with ID starting with 4379513206c1639335bb05ee8287de982289e551bc3d4966f636dc8340b2ecf8 not found: ID does not exist" containerID="4379513206c1639335bb05ee8287de982289e551bc3d4966f636dc8340b2ecf8" Dec 03 20:06:16.760855 master-0 kubenswrapper[9368]: I1203 20:06:16.760362 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4379513206c1639335bb05ee8287de982289e551bc3d4966f636dc8340b2ecf8"} err="failed to get container status \"4379513206c1639335bb05ee8287de982289e551bc3d4966f636dc8340b2ecf8\": rpc error: code = NotFound desc = could not find container \"4379513206c1639335bb05ee8287de982289e551bc3d4966f636dc8340b2ecf8\": container with ID starting with 4379513206c1639335bb05ee8287de982289e551bc3d4966f636dc8340b2ecf8 not found: ID does not exist" Dec 03 20:06:16.760855 master-0 kubenswrapper[9368]: I1203 20:06:16.760390 9368 scope.go:117] "RemoveContainer" containerID="341b601d165e50b66bc27ce1e4916e7fa4ef52e1059015c14333dc841ef12229" Dec 03 20:06:16.760855 master-0 kubenswrapper[9368]: E1203 20:06:16.760692 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"341b601d165e50b66bc27ce1e4916e7fa4ef52e1059015c14333dc841ef12229\": container with ID starting with 341b601d165e50b66bc27ce1e4916e7fa4ef52e1059015c14333dc841ef12229 not found: ID does not exist" containerID="341b601d165e50b66bc27ce1e4916e7fa4ef52e1059015c14333dc841ef12229" Dec 03 20:06:16.760855 master-0 kubenswrapper[9368]: I1203 20:06:16.760733 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341b601d165e50b66bc27ce1e4916e7fa4ef52e1059015c14333dc841ef12229"} err="failed to get container status \"341b601d165e50b66bc27ce1e4916e7fa4ef52e1059015c14333dc841ef12229\": rpc error: code = NotFound desc = could not find container \"341b601d165e50b66bc27ce1e4916e7fa4ef52e1059015c14333dc841ef12229\": container with ID starting with 341b601d165e50b66bc27ce1e4916e7fa4ef52e1059015c14333dc841ef12229 not found: ID does not exist" Dec 03 20:06:16.760855 master-0 kubenswrapper[9368]: I1203 20:06:16.760767 9368 scope.go:117] "RemoveContainer" containerID="654ce27fb70f480beba5ca8af4a5c2faaea9183cad789692159d1b32739ab7ee" Dec 03 20:06:16.761440 master-0 kubenswrapper[9368]: E1203 20:06:16.761122 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"654ce27fb70f480beba5ca8af4a5c2faaea9183cad789692159d1b32739ab7ee\": container with ID starting with 654ce27fb70f480beba5ca8af4a5c2faaea9183cad789692159d1b32739ab7ee not found: ID does not exist" containerID="654ce27fb70f480beba5ca8af4a5c2faaea9183cad789692159d1b32739ab7ee" Dec 03 20:06:16.761440 master-0 kubenswrapper[9368]: I1203 20:06:16.761154 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"654ce27fb70f480beba5ca8af4a5c2faaea9183cad789692159d1b32739ab7ee"} err="failed to get container status \"654ce27fb70f480beba5ca8af4a5c2faaea9183cad789692159d1b32739ab7ee\": rpc error: code = NotFound desc = could not find container 
\"654ce27fb70f480beba5ca8af4a5c2faaea9183cad789692159d1b32739ab7ee\": container with ID starting with 654ce27fb70f480beba5ca8af4a5c2faaea9183cad789692159d1b32739ab7ee not found: ID does not exist" Dec 03 20:06:16.761440 master-0 kubenswrapper[9368]: I1203 20:06:16.761177 9368 scope.go:117] "RemoveContainer" containerID="c773db2a1877d6932c57258d42f5b394294b213e87c30f8a0a0e8aca67ad0063" Dec 03 20:06:16.761440 master-0 kubenswrapper[9368]: E1203 20:06:16.761422 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c773db2a1877d6932c57258d42f5b394294b213e87c30f8a0a0e8aca67ad0063\": container with ID starting with c773db2a1877d6932c57258d42f5b394294b213e87c30f8a0a0e8aca67ad0063 not found: ID does not exist" containerID="c773db2a1877d6932c57258d42f5b394294b213e87c30f8a0a0e8aca67ad0063" Dec 03 20:06:16.761554 master-0 kubenswrapper[9368]: I1203 20:06:16.761463 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c773db2a1877d6932c57258d42f5b394294b213e87c30f8a0a0e8aca67ad0063"} err="failed to get container status \"c773db2a1877d6932c57258d42f5b394294b213e87c30f8a0a0e8aca67ad0063\": rpc error: code = NotFound desc = could not find container \"c773db2a1877d6932c57258d42f5b394294b213e87c30f8a0a0e8aca67ad0063\": container with ID starting with c773db2a1877d6932c57258d42f5b394294b213e87c30f8a0a0e8aca67ad0063 not found: ID does not exist" Dec 03 20:06:16.761554 master-0 kubenswrapper[9368]: I1203 20:06:16.761496 9368 scope.go:117] "RemoveContainer" containerID="744faadce32102a4f51bf311e6ce0b868fa1346e51cfebbdff76ea1eb3693fe2" Dec 03 20:06:16.761940 master-0 kubenswrapper[9368]: E1203 20:06:16.761815 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"744faadce32102a4f51bf311e6ce0b868fa1346e51cfebbdff76ea1eb3693fe2\": container with ID starting with 
744faadce32102a4f51bf311e6ce0b868fa1346e51cfebbdff76ea1eb3693fe2 not found: ID does not exist" containerID="744faadce32102a4f51bf311e6ce0b868fa1346e51cfebbdff76ea1eb3693fe2" Dec 03 20:06:16.761940 master-0 kubenswrapper[9368]: I1203 20:06:16.761857 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"744faadce32102a4f51bf311e6ce0b868fa1346e51cfebbdff76ea1eb3693fe2"} err="failed to get container status \"744faadce32102a4f51bf311e6ce0b868fa1346e51cfebbdff76ea1eb3693fe2\": rpc error: code = NotFound desc = could not find container \"744faadce32102a4f51bf311e6ce0b868fa1346e51cfebbdff76ea1eb3693fe2\": container with ID starting with 744faadce32102a4f51bf311e6ce0b868fa1346e51cfebbdff76ea1eb3693fe2 not found: ID does not exist" Dec 03 20:06:16.761940 master-0 kubenswrapper[9368]: I1203 20:06:16.761881 9368 scope.go:117] "RemoveContainer" containerID="3bc3b0513399a4dbff7b1cc288d3bbd1c2bcb7799b1e463991c6f6704c28e766" Dec 03 20:06:16.762302 master-0 kubenswrapper[9368]: E1203 20:06:16.762271 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bc3b0513399a4dbff7b1cc288d3bbd1c2bcb7799b1e463991c6f6704c28e766\": container with ID starting with 3bc3b0513399a4dbff7b1cc288d3bbd1c2bcb7799b1e463991c6f6704c28e766 not found: ID does not exist" containerID="3bc3b0513399a4dbff7b1cc288d3bbd1c2bcb7799b1e463991c6f6704c28e766" Dec 03 20:06:16.762351 master-0 kubenswrapper[9368]: I1203 20:06:16.762310 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bc3b0513399a4dbff7b1cc288d3bbd1c2bcb7799b1e463991c6f6704c28e766"} err="failed to get container status \"3bc3b0513399a4dbff7b1cc288d3bbd1c2bcb7799b1e463991c6f6704c28e766\": rpc error: code = NotFound desc = could not find container \"3bc3b0513399a4dbff7b1cc288d3bbd1c2bcb7799b1e463991c6f6704c28e766\": container with ID starting with 
3bc3b0513399a4dbff7b1cc288d3bbd1c2bcb7799b1e463991c6f6704c28e766 not found: ID does not exist" Dec 03 20:06:16.762387 master-0 kubenswrapper[9368]: I1203 20:06:16.762347 9368 scope.go:117] "RemoveContainer" containerID="e8e3349c588e032c5b231176a3b6f6e0b37f7441a8fa80e01747aaded424ead5" Dec 03 20:06:16.762675 master-0 kubenswrapper[9368]: E1203 20:06:16.762640 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8e3349c588e032c5b231176a3b6f6e0b37f7441a8fa80e01747aaded424ead5\": container with ID starting with e8e3349c588e032c5b231176a3b6f6e0b37f7441a8fa80e01747aaded424ead5 not found: ID does not exist" containerID="e8e3349c588e032c5b231176a3b6f6e0b37f7441a8fa80e01747aaded424ead5" Dec 03 20:06:16.762719 master-0 kubenswrapper[9368]: I1203 20:06:16.762676 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8e3349c588e032c5b231176a3b6f6e0b37f7441a8fa80e01747aaded424ead5"} err="failed to get container status \"e8e3349c588e032c5b231176a3b6f6e0b37f7441a8fa80e01747aaded424ead5\": rpc error: code = NotFound desc = could not find container \"e8e3349c588e032c5b231176a3b6f6e0b37f7441a8fa80e01747aaded424ead5\": container with ID starting with e8e3349c588e032c5b231176a3b6f6e0b37f7441a8fa80e01747aaded424ead5 not found: ID does not exist" Dec 03 20:06:16.762719 master-0 kubenswrapper[9368]: I1203 20:06:16.762700 9368 scope.go:117] "RemoveContainer" containerID="fd46fba44017fc29b42f47a162580459a77d137ccc3daa28e90491a52f6c5e38" Dec 03 20:06:16.766150 master-0 kubenswrapper[9368]: E1203 20:06:16.763378 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd46fba44017fc29b42f47a162580459a77d137ccc3daa28e90491a52f6c5e38\": container with ID starting with fd46fba44017fc29b42f47a162580459a77d137ccc3daa28e90491a52f6c5e38 not found: ID does not exist" 
containerID="fd46fba44017fc29b42f47a162580459a77d137ccc3daa28e90491a52f6c5e38" Dec 03 20:06:16.766150 master-0 kubenswrapper[9368]: I1203 20:06:16.764097 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd46fba44017fc29b42f47a162580459a77d137ccc3daa28e90491a52f6c5e38"} err="failed to get container status \"fd46fba44017fc29b42f47a162580459a77d137ccc3daa28e90491a52f6c5e38\": rpc error: code = NotFound desc = could not find container \"fd46fba44017fc29b42f47a162580459a77d137ccc3daa28e90491a52f6c5e38\": container with ID starting with fd46fba44017fc29b42f47a162580459a77d137ccc3daa28e90491a52f6c5e38 not found: ID does not exist" Dec 03 20:06:16.766150 master-0 kubenswrapper[9368]: I1203 20:06:16.764127 9368 scope.go:117] "RemoveContainer" containerID="16f863a99a7b4db6f75ba856ee48509b29d62e76913caec7ed378fa26c23b8d6" Dec 03 20:06:16.767735 master-0 kubenswrapper[9368]: E1203 20:06:16.767695 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16f863a99a7b4db6f75ba856ee48509b29d62e76913caec7ed378fa26c23b8d6\": container with ID starting with 16f863a99a7b4db6f75ba856ee48509b29d62e76913caec7ed378fa26c23b8d6 not found: ID does not exist" containerID="16f863a99a7b4db6f75ba856ee48509b29d62e76913caec7ed378fa26c23b8d6" Dec 03 20:06:16.767847 master-0 kubenswrapper[9368]: I1203 20:06:16.767731 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16f863a99a7b4db6f75ba856ee48509b29d62e76913caec7ed378fa26c23b8d6"} err="failed to get container status \"16f863a99a7b4db6f75ba856ee48509b29d62e76913caec7ed378fa26c23b8d6\": rpc error: code = NotFound desc = could not find container \"16f863a99a7b4db6f75ba856ee48509b29d62e76913caec7ed378fa26c23b8d6\": container with ID starting with 16f863a99a7b4db6f75ba856ee48509b29d62e76913caec7ed378fa26c23b8d6 not found: ID does not exist" Dec 03 20:06:16.767847 master-0 
kubenswrapper[9368]: I1203 20:06:16.767752 9368 scope.go:117] "RemoveContainer" containerID="ecf333f033fb5f8af44f74367011135c5c68151c236ed2fb6c9deb690a21c615" Dec 03 20:06:16.854865 master-0 kubenswrapper[9368]: I1203 20:06:16.852712 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 20:06:17.292467 master-0 kubenswrapper[9368]: I1203 20:06:17.292397 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz"] Dec 03 20:06:17.296742 master-0 kubenswrapper[9368]: I1203 20:06:17.296684 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-5775bfbf6d-psrtz"] Dec 03 20:06:17.469045 master-0 kubenswrapper[9368]: I1203 20:06:17.468955 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/3.log" Dec 03 20:06:17.471881 master-0 kubenswrapper[9368]: I1203 20:06:17.471831 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/3.log" Dec 03 20:06:17.475166 master-0 kubenswrapper[9368]: I1203 20:06:17.475107 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/3.log" Dec 03 20:06:17.478402 master-0 kubenswrapper[9368]: I1203 20:06:17.478345 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/3.log" Dec 03 20:06:17.481380 master-0 kubenswrapper[9368]: I1203 20:06:17.481328 9368 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/3.log" Dec 03 20:06:17.484288 master-0 kubenswrapper[9368]: I1203 20:06:17.484242 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/3.log" Dec 03 20:06:17.487091 master-0 kubenswrapper[9368]: I1203 20:06:17.487055 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/3.log" Dec 03 20:06:17.491438 master-0 kubenswrapper[9368]: I1203 20:06:17.491397 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"2cf827f1b0ff93b50c80872c6d1a48b2d6dcae8bf37fb7372318857b2511c290"} Dec 03 20:06:17.522751 master-0 kubenswrapper[9368]: I1203 20:06:17.522642 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" podStartSLOduration=581.522613703 podStartE2EDuration="9m41.522613703s" podCreationTimestamp="2025-12-03 19:56:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:06:17.311752078 +0000 UTC m=+642.973002039" watchObservedRunningTime="2025-12-03 20:06:17.522613703 +0000 UTC m=+643.183863644" Dec 03 20:06:17.852868 master-0 kubenswrapper[9368]: I1203 20:06:17.852696 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe 
status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 20:06:17.853115 master-0 kubenswrapper[9368]: I1203 20:06:17.852871 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 20:06:18.558077 master-0 kubenswrapper[9368]: I1203 20:06:18.557977 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61ca5373-413c-4824-ba19-13b99c3081e4" path="/var/lib/kubelet/pods/61ca5373-413c-4824-ba19-13b99c3081e4/volumes" Dec 03 20:06:18.607682 master-0 kubenswrapper[9368]: I1203 20:06:18.607595 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:06:18.664185 master-0 kubenswrapper[9368]: I1203 20:06:18.664080 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 20:06:18.664185 master-0 kubenswrapper[9368]: I1203 20:06:18.664165 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers)" Dec 03 20:06:18.853529 master-0 kubenswrapper[9368]: I1203 20:06:18.853358 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 20:06:18.853529 master-0 kubenswrapper[9368]: I1203 20:06:18.853442 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 20:06:19.125073 master-0 kubenswrapper[9368]: I1203 20:06:19.124965 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Dec 03 20:06:20.726018 master-0 kubenswrapper[9368]: E1203 20:06:20.725840 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" is forbidden: the server was unable to return a response in the time allotted, but may still be processing the request (get limitranges)" pod="openshift-etcd/etcd-master-0" Dec 03 20:06:20.853541 master-0 kubenswrapper[9368]: I1203 20:06:20.853457 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 20:06:20.853893 master-0 kubenswrapper[9368]: I1203 20:06:20.853563 9368 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 20:06:21.408997 master-0 kubenswrapper[9368]: I1203 20:06:21.408865 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 20:06:21.413563 master-0 kubenswrapper[9368]: I1203 20:06:21.413508 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 20:06:21.664299 master-0 kubenswrapper[9368]: I1203 20:06:21.664061 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 20:06:21.664299 master-0 kubenswrapper[9368]: I1203 20:06:21.664156 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 20:06:21.918391 master-0 kubenswrapper[9368]: I1203 20:06:21.918192 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:06:22.845599 master-0 kubenswrapper[9368]: I1203 
20:06:22.845482 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:06:23.003897 master-0 kubenswrapper[9368]: E1203 20:06:23.003813 9368 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 20:06:23.003897 master-0 kubenswrapper[9368]: E1203 20:06:23.003864 9368 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 20:06:23.464286 master-0 kubenswrapper[9368]: I1203 20:06:23.464198 9368 patch_prober.go:28] interesting pod/authentication-operator-7479ffdf48-mfwhz container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 20:06:23.464666 master-0 kubenswrapper[9368]: I1203 20:06:23.464303 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podUID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 20:06:23.853374 master-0 kubenswrapper[9368]: I1203 20:06:23.853209 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers)" start-of-body= Dec 03 20:06:23.853374 master-0 kubenswrapper[9368]: I1203 20:06:23.853309 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 20:06:24.125022 master-0 kubenswrapper[9368]: I1203 20:06:24.124917 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Dec 03 20:06:24.160246 master-0 kubenswrapper[9368]: I1203 20:06:24.160188 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Dec 03 20:06:24.663178 master-0 kubenswrapper[9368]: I1203 20:06:24.663103 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 20:06:24.663469 master-0 kubenswrapper[9368]: I1203 20:06:24.663213 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 20:06:24.663469 master-0 kubenswrapper[9368]: I1203 20:06:24.663292 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 20:06:24.664189 master-0 kubenswrapper[9368]: I1203 
20:06:24.664133 9368 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"e4d111ea4bb5f2834fb95352ff94c389e71a98b14756480233a487fdada83623"} pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Dec 03 20:06:24.664313 master-0 kubenswrapper[9368]: I1203 20:06:24.664208 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" containerID="cri-o://e4d111ea4bb5f2834fb95352ff94c389e71a98b14756480233a487fdada83623" gracePeriod=30 Dec 03 20:06:24.677065 master-0 kubenswrapper[9368]: I1203 20:06:24.676976 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": read tcp 10.128.0.2:37368->10.128.0.17:8443: read: connection reset by peer" start-of-body= Dec 03 20:06:24.677281 master-0 kubenswrapper[9368]: I1203 20:06:24.677073 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": read tcp 10.128.0.2:37368->10.128.0.17:8443: read: connection reset by peer" Dec 03 20:06:24.918521 master-0 kubenswrapper[9368]: I1203 20:06:24.918285 9368 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get 
\"https://localhost:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 20:06:25.563817 master-0 kubenswrapper[9368]: I1203 20:06:25.563687 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-8xmrv_0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/openshift-config-operator/2.log" Dec 03 20:06:25.575550 master-0 kubenswrapper[9368]: I1203 20:06:25.575461 9368 generic.go:334] "Generic (PLEG): container finished" podID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerID="e4d111ea4bb5f2834fb95352ff94c389e71a98b14756480233a487fdada83623" exitCode=255 Dec 03 20:06:25.576293 master-0 kubenswrapper[9368]: I1203 20:06:25.576197 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" event={"ID":"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9","Type":"ContainerDied","Data":"e4d111ea4bb5f2834fb95352ff94c389e71a98b14756480233a487fdada83623"} Dec 03 20:06:25.576485 master-0 kubenswrapper[9368]: I1203 20:06:25.576335 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" event={"ID":"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9","Type":"ContainerStarted","Data":"da7a3994394a55c9298b32c298537581a06cd839f637078bf5012d0ce27db382"} Dec 03 20:06:25.576485 master-0 kubenswrapper[9368]: I1203 20:06:25.576420 9368 scope.go:117] "RemoveContainer" containerID="2dd513c4c7700ec665cd85658968cfa47ab585f4855779f0285e2f319e1b23ec" Dec 03 20:06:25.576813 master-0 kubenswrapper[9368]: I1203 20:06:25.576713 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 20:06:26.248988 master-0 kubenswrapper[9368]: I1203 20:06:26.248862 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 
03 20:06:26.255901 master-0 kubenswrapper[9368]: I1203 20:06:26.255848 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 20:06:26.587035 master-0 kubenswrapper[9368]: I1203 20:06:26.586849 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-8xmrv_0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/openshift-config-operator/2.log"
Dec 03 20:06:26.594925 master-0 kubenswrapper[9368]: I1203 20:06:26.594864 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 20:06:27.337401 master-0 kubenswrapper[9368]: I1203 20:06:27.337312 9368 patch_prober.go:28] interesting pod/machine-config-daemon-7t8bs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 20:06:27.337900 master-0 kubenswrapper[9368]: I1203 20:06:27.337400 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 20:06:27.337900 master-0 kubenswrapper[9368]: I1203 20:06:27.337469 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs"
Dec 03 20:06:27.338297 master-0 kubenswrapper[9368]: I1203 20:06:27.338234 9368 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf0ee7669690c522329aaa6f304ff26947b64db398890f4e63ca209b2410a161"} pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 20:06:27.338394 master-0 kubenswrapper[9368]: I1203 20:06:27.338346 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" containerID="cri-o://cf0ee7669690c522329aaa6f304ff26947b64db398890f4e63ca209b2410a161" gracePeriod=600
Dec 03 20:06:27.544240 master-0 kubenswrapper[9368]: I1203 20:06:27.544198 9368 scope.go:117] "RemoveContainer" containerID="8112a7cb98ed4f9746283158ddbbb35ec5fbfefafdb864fd1afaa4c7f81f5842"
Dec 03 20:06:27.544493 master-0 kubenswrapper[9368]: E1203 20:06:27.544414 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-86897dd478-s29k7_openshift-cluster-storage-operator(367c2c7c-1fc8-4608-aa94-b64c6c70cc61)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" podUID="367c2c7c-1fc8-4608-aa94-b64c6c70cc61"
Dec 03 20:06:27.601635 master-0 kubenswrapper[9368]: I1203 20:06:27.601588 9368 generic.go:334] "Generic (PLEG): container finished" podID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerID="cf0ee7669690c522329aaa6f304ff26947b64db398890f4e63ca209b2410a161" exitCode=0
Dec 03 20:06:27.602605 master-0 kubenswrapper[9368]: I1203 20:06:27.601631 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" event={"ID":"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b","Type":"ContainerDied","Data":"cf0ee7669690c522329aaa6f304ff26947b64db398890f4e63ca209b2410a161"}
Dec 03 20:06:27.602605 master-0 kubenswrapper[9368]: I1203 20:06:27.601718 9368 scope.go:117] "RemoveContainer" containerID="550fa2508090ec9228e5344d14eb3903d47f1fd24e235f6122c95a9e089d9e56"
Dec 03 20:06:28.614437 master-0 kubenswrapper[9368]: I1203 20:06:28.614337 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" event={"ID":"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b","Type":"ContainerStarted","Data":"07b6cf5187d73a5f60790f2be4d7efe702428be4f1b035394f75ae9cb9fd2d4a"}
Dec 03 20:06:28.703354 master-0 kubenswrapper[9368]: E1203 20:06:28.703220 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Dec 03 20:06:28.862458 master-0 kubenswrapper[9368]: E1203 20:06:28.862388 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Dec 03 20:06:29.161382 master-0 kubenswrapper[9368]: I1203 20:06:29.161305 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0"
Dec 03 20:06:29.853952 master-0 kubenswrapper[9368]: I1203 20:06:29.853763 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 20:06:29.854831 master-0 kubenswrapper[9368]: I1203 20:06:29.853950 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 20:06:30.664539 master-0 kubenswrapper[9368]: I1203 20:06:30.664410 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 20:06:30.664539 master-0 kubenswrapper[9368]: I1203 20:06:30.664514 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 20:06:32.854354 master-0 kubenswrapper[9368]: I1203 20:06:32.854275 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 20:06:32.854946 master-0 kubenswrapper[9368]: I1203 20:06:32.854375 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 20:06:33.464284 master-0 kubenswrapper[9368]: I1203 20:06:33.464201 9368 patch_prober.go:28] interesting pod/authentication-operator-7479ffdf48-mfwhz container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 20:06:33.464760 master-0 kubenswrapper[9368]: I1203 20:06:33.464295 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podUID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 20:06:33.663653 master-0 kubenswrapper[9368]: I1203 20:06:33.663537 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 20:06:33.663653 master-0 kubenswrapper[9368]: I1203 20:06:33.663641 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 20:06:34.542167 master-0 kubenswrapper[9368]: E1203 20:06:34.542067 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0"
Dec 03 20:06:34.920028 master-0 kubenswrapper[9368]: I1203 20:06:34.919924 9368 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 20:06:35.853520 master-0 kubenswrapper[9368]: I1203 20:06:35.853389 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 20:06:35.853520 master-0 kubenswrapper[9368]: I1203 20:06:35.853504 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 20:06:36.664285 master-0 kubenswrapper[9368]: I1203 20:06:36.664163 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 20:06:36.664285 master-0 kubenswrapper[9368]: I1203 20:06:36.664264 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 20:06:36.664703 master-0 kubenswrapper[9368]: I1203 20:06:36.664316 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv"
Dec 03 20:06:36.665124 master-0 kubenswrapper[9368]: I1203 20:06:36.665049 9368 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"da7a3994394a55c9298b32c298537581a06cd839f637078bf5012d0ce27db382"} pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted"
Dec 03 20:06:36.665257 master-0 kubenswrapper[9368]: I1203 20:06:36.665127 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" containerID="cri-o://da7a3994394a55c9298b32c298537581a06cd839f637078bf5012d0ce27db382" gracePeriod=30
Dec 03 20:06:36.678441 master-0 kubenswrapper[9368]: I1203 20:06:36.678367 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": read tcp 10.128.0.2:42204->10.128.0.17:8443: read: connection reset by peer" start-of-body=
Dec 03 20:06:36.678441 master-0 kubenswrapper[9368]: I1203 20:06:36.678424 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": read tcp 10.128.0.2:42204->10.128.0.17:8443: read: connection reset by peer"
Dec 03 20:06:36.781959 master-0 kubenswrapper[9368]: E1203 20:06:36.781895 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-8xmrv_openshift-config-operator(0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9"
Dec 03 20:06:37.682545 master-0 kubenswrapper[9368]: I1203 20:06:37.682440 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-8xmrv_0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/openshift-config-operator/3.log"
Dec 03 20:06:37.683504 master-0 kubenswrapper[9368]: I1203 20:06:37.683478 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-8xmrv_0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/openshift-config-operator/2.log"
Dec 03 20:06:37.684268 master-0 kubenswrapper[9368]: I1203 20:06:37.684199 9368 generic.go:334] "Generic (PLEG): container finished" podID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerID="da7a3994394a55c9298b32c298537581a06cd839f637078bf5012d0ce27db382" exitCode=255
Dec 03 20:06:37.684413 master-0 kubenswrapper[9368]: I1203 20:06:37.684276 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" event={"ID":"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9","Type":"ContainerDied","Data":"da7a3994394a55c9298b32c298537581a06cd839f637078bf5012d0ce27db382"}
Dec 03 20:06:37.684413 master-0 kubenswrapper[9368]: I1203 20:06:37.684341 9368 scope.go:117] "RemoveContainer" containerID="e4d111ea4bb5f2834fb95352ff94c389e71a98b14756480233a487fdada83623"
Dec 03 20:06:37.685215 master-0 kubenswrapper[9368]: I1203 20:06:37.685156 9368 scope.go:117] "RemoveContainer" containerID="da7a3994394a55c9298b32c298537581a06cd839f637078bf5012d0ce27db382"
Dec 03 20:06:37.685573 master-0 kubenswrapper[9368]: E1203 20:06:37.685506 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-8xmrv_openshift-config-operator(0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9"
Dec 03 20:06:38.695230 master-0 kubenswrapper[9368]: I1203 20:06:38.695181 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-8xmrv_0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/openshift-config-operator/3.log"
Dec 03 20:06:39.544814 master-0 kubenswrapper[9368]: I1203 20:06:39.544707 9368 scope.go:117] "RemoveContainer" containerID="8112a7cb98ed4f9746283158ddbbb35ec5fbfefafdb864fd1afaa4c7f81f5842"
Dec 03 20:06:39.709502 master-0 kubenswrapper[9368]: I1203 20:06:39.709380 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/3.log"
Dec 03 20:06:39.710318 master-0 kubenswrapper[9368]: I1203 20:06:39.709493 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" event={"ID":"367c2c7c-1fc8-4608-aa94-b64c6c70cc61","Type":"ContainerStarted","Data":"7b8c5cc822c68f255818bf9db29ff6c6999f5727fccaeecd63599c6adbdd8185"}
Dec 03 20:06:40.691739 master-0 kubenswrapper[9368]: I1203 20:06:40.691622 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"]
Dec 03 20:06:43.464053 master-0 kubenswrapper[9368]: I1203 20:06:43.463894 9368 patch_prober.go:28] interesting pod/authentication-operator-7479ffdf48-mfwhz container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 20:06:43.464053 master-0 kubenswrapper[9368]: I1203 20:06:43.464017 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podUID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 20:06:43.465397 master-0 kubenswrapper[9368]: I1203 20:06:43.464090 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 20:06:43.465397 master-0 kubenswrapper[9368]: I1203 20:06:43.465212 9368 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="authentication-operator" containerStatusID={"Type":"cri-o","ID":"9e5ef7b3c0490a710149a5e033b19d384ce5d0dfe6bb0ef15f5d72d083cc1ce9"} pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" containerMessage="Container authentication-operator failed liveness probe, will be restarted"
Dec 03 20:06:43.465397 master-0 kubenswrapper[9368]: I1203 20:06:43.465268 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podUID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" containerName="authentication-operator" containerID="cri-o://9e5ef7b3c0490a710149a5e033b19d384ce5d0dfe6bb0ef15f5d72d083cc1ce9" gracePeriod=30
Dec 03 20:06:43.741060 master-0 kubenswrapper[9368]: I1203 20:06:43.740990 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/4.log"
Dec 03 20:06:43.741744 master-0 kubenswrapper[9368]: I1203 20:06:43.741673 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/3.log"
Dec 03 20:06:43.741968 master-0 kubenswrapper[9368]: I1203 20:06:43.741744 9368 generic.go:334] "Generic (PLEG): container finished" podID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" containerID="9e5ef7b3c0490a710149a5e033b19d384ce5d0dfe6bb0ef15f5d72d083cc1ce9" exitCode=255
Dec 03 20:06:43.741968 master-0 kubenswrapper[9368]: I1203 20:06:43.741807 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" event={"ID":"a185ee17-4b4b-4d20-a8ed-56a2a01f1807","Type":"ContainerDied","Data":"9e5ef7b3c0490a710149a5e033b19d384ce5d0dfe6bb0ef15f5d72d083cc1ce9"}
Dec 03 20:06:43.741968 master-0 kubenswrapper[9368]: I1203 20:06:43.741857 9368 scope.go:117] "RemoveContainer" containerID="6b7ea8626bddf0947a6929d715c64bbadf4eccc528c9e9ac527e662555f2ab85"
Dec 03 20:06:44.605800 master-0 kubenswrapper[9368]: I1203 20:06:44.605668 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=4.605649985 podStartE2EDuration="4.605649985s" podCreationTimestamp="2025-12-03 20:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:06:44.603535394 +0000 UTC m=+670.264785345" watchObservedRunningTime="2025-12-03 20:06:44.605649985 +0000 UTC m=+670.266899916"
Dec 03 20:06:44.753492 master-0 kubenswrapper[9368]: I1203 20:06:44.753401 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/4.log"
Dec 03 20:06:44.753837 master-0 kubenswrapper[9368]: I1203 20:06:44.753507 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" event={"ID":"a185ee17-4b4b-4d20-a8ed-56a2a01f1807","Type":"ContainerStarted","Data":"0b22734703d42f07c436963e348c3be11ab4f5053e6afed5996abb0dab7d690d"}
Dec 03 20:06:44.920139 master-0 kubenswrapper[9368]: I1203 20:06:44.920037 9368 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 20:06:44.920307 master-0 kubenswrapper[9368]: I1203 20:06:44.920179 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 20:06:44.921069 master-0 kubenswrapper[9368]: I1203 20:06:44.921016 9368 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"2cf827f1b0ff93b50c80872c6d1a48b2d6dcae8bf37fb7372318857b2511c290"} pod="kube-system/bootstrap-kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Dec 03 20:06:44.921160 master-0 kubenswrapper[9368]: I1203 20:06:44.921092 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" containerID="cri-o://2cf827f1b0ff93b50c80872c6d1a48b2d6dcae8bf37fb7372318857b2511c290" gracePeriod=30
Dec 03 20:06:45.704624 master-0 kubenswrapper[9368]: E1203 20:06:45.704448 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Dec 03 20:06:45.767452 master-0 kubenswrapper[9368]: I1203 20:06:45.767355 9368 generic.go:334] "Generic (PLEG): container finished" podID="7bce50c457ac1f4721bc81a570dd238a" containerID="2cf827f1b0ff93b50c80872c6d1a48b2d6dcae8bf37fb7372318857b2511c290" exitCode=255
Dec 03 20:06:45.767736 master-0 kubenswrapper[9368]: I1203 20:06:45.767496 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerDied","Data":"2cf827f1b0ff93b50c80872c6d1a48b2d6dcae8bf37fb7372318857b2511c290"}
Dec 03 20:06:45.767736 master-0 kubenswrapper[9368]: I1203 20:06:45.767581 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967"}
Dec 03 20:06:45.767736 master-0 kubenswrapper[9368]: I1203 20:06:45.767615 9368 scope.go:117] "RemoveContainer" containerID="2bc34dd3df75f29672c73e791045d1e82bca7040b7e6a8728aa43a5fe5c90f24"
Dec 03 20:06:45.820348 master-0 kubenswrapper[9368]: I1203 20:06:45.820262 9368 scope.go:117] "RemoveContainer" containerID="23c2b742ed78624af8a87bafdac0a226661dbc177a2ddfac515be738b044bdfc"
Dec 03 20:06:46.777442 master-0 kubenswrapper[9368]: I1203 20:06:46.777379 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/3.log"
Dec 03 20:06:46.778555 master-0 kubenswrapper[9368]: I1203 20:06:46.778483 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/2.log"
Dec 03 20:06:46.778681 master-0 kubenswrapper[9368]: I1203 20:06:46.778593 9368 generic.go:334] "Generic (PLEG): container finished" podID="f749c7f2-1fd7-4078-a92d-0ae5523998ac" containerID="f7b2dd4d7eafdc4336ee0182ab9a0527c12ff38408c8d52991e189907554e424" exitCode=255
Dec 03 20:06:46.778681 master-0 kubenswrapper[9368]: I1203 20:06:46.778654 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" event={"ID":"f749c7f2-1fd7-4078-a92d-0ae5523998ac","Type":"ContainerDied","Data":"f7b2dd4d7eafdc4336ee0182ab9a0527c12ff38408c8d52991e189907554e424"}
Dec 03 20:06:46.778859 master-0 kubenswrapper[9368]: I1203 20:06:46.778741 9368 scope.go:117] "RemoveContainer" containerID="b30a30d243315200a6f03be3c0553cf1e0283ee13ed3b826cd4d8aa9d7481e81"
Dec 03 20:06:46.779356 master-0 kubenswrapper[9368]: I1203 20:06:46.779299 9368 scope.go:117] "RemoveContainer" containerID="f7b2dd4d7eafdc4336ee0182ab9a0527c12ff38408c8d52991e189907554e424"
Dec 03 20:06:46.779732 master-0 kubenswrapper[9368]: E1203 20:06:46.779668 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-storage-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-storage-operator pod=cluster-storage-operator-f84784664-wnl8p_openshift-cluster-storage-operator(f749c7f2-1fd7-4078-a92d-0ae5523998ac)\"" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" podUID="f749c7f2-1fd7-4078-a92d-0ae5523998ac"
Dec 03 20:06:46.783016 master-0 kubenswrapper[9368]: I1203 20:06:46.782959 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-589f5cdc9d-4fzrl_f9f99422-7991-40ef-92a1-de2e603e47b9/cluster-olm-operator/2.log"
Dec 03 20:06:46.785113 master-0 kubenswrapper[9368]: I1203 20:06:46.785045 9368 generic.go:334] "Generic (PLEG): container finished" podID="f9f99422-7991-40ef-92a1-de2e603e47b9" containerID="5e83bcf58af1482033711d7ef5e23c1429621a6a16b43c85914ace2af8aca901" exitCode=255
Dec 03 20:06:46.785241 master-0 kubenswrapper[9368]: I1203 20:06:46.785146 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" event={"ID":"f9f99422-7991-40ef-92a1-de2e603e47b9","Type":"ContainerDied","Data":"5e83bcf58af1482033711d7ef5e23c1429621a6a16b43c85914ace2af8aca901"}
Dec 03 20:06:46.785949 master-0 kubenswrapper[9368]: I1203 20:06:46.785881 9368 scope.go:117] "RemoveContainer" containerID="5e83bcf58af1482033711d7ef5e23c1429621a6a16b43c85914ace2af8aca901"
Dec 03 20:06:46.786243 master-0 kubenswrapper[9368]: E1203 20:06:46.786201 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-olm-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-olm-operator pod=cluster-olm-operator-589f5cdc9d-4fzrl_openshift-cluster-olm-operator(f9f99422-7991-40ef-92a1-de2e603e47b9)\"" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" podUID="f9f99422-7991-40ef-92a1-de2e603e47b9"
Dec 03 20:06:46.788380 master-0 kubenswrapper[9368]: I1203 20:06:46.788337 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/4.log"
Dec 03 20:06:46.789201 master-0 kubenswrapper[9368]: I1203 20:06:46.789149 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/3.log"
Dec 03 20:06:46.789323 master-0 kubenswrapper[9368]: I1203 20:06:46.789223 9368 generic.go:334] "Generic (PLEG): container finished" podID="943feb0d-7d31-446a-9100-dfc4ef013d12" containerID="abf1acea0f13046f42e18d29f9f01a5591776e77d3e8cc4b525da74b968fc06b" exitCode=255
Dec 03 20:06:46.789419 master-0 kubenswrapper[9368]: I1203 20:06:46.789344 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" event={"ID":"943feb0d-7d31-446a-9100-dfc4ef013d12","Type":"ContainerDied","Data":"abf1acea0f13046f42e18d29f9f01a5591776e77d3e8cc4b525da74b968fc06b"}
Dec 03 20:06:46.789982 master-0 kubenswrapper[9368]: I1203 20:06:46.789943 9368 scope.go:117] "RemoveContainer" containerID="abf1acea0f13046f42e18d29f9f01a5591776e77d3e8cc4b525da74b968fc06b"
Dec 03 20:06:46.790656 master-0 kubenswrapper[9368]: E1203 20:06:46.790611 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-operator pod=kube-apiserver-operator-5b557b5f57-9t9fn_openshift-kube-apiserver-operator(943feb0d-7d31-446a-9100-dfc4ef013d12)\"" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" podUID="943feb0d-7d31-446a-9100-dfc4ef013d12"
Dec 03 20:06:46.791946 master-0 kubenswrapper[9368]: I1203 20:06:46.791896 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/4.log"
Dec 03 20:06:46.792834 master-0 kubenswrapper[9368]: I1203 20:06:46.792724 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/3.log"
Dec 03 20:06:46.793000 master-0 kubenswrapper[9368]: I1203 20:06:46.792844 9368 generic.go:334] "Generic (PLEG): container finished" podID="11e2c94f-f9e9-415b-a550-3006a4632ba4" containerID="89ed390af07eecb0f2a6fd24fe986b57e8e8f83dbf2ff2202963967a2fcc7b5e" exitCode=255
Dec 03 20:06:46.793000 master-0 kubenswrapper[9368]: I1203 20:06:46.792942 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" event={"ID":"11e2c94f-f9e9-415b-a550-3006a4632ba4","Type":"ContainerDied","Data":"89ed390af07eecb0f2a6fd24fe986b57e8e8f83dbf2ff2202963967a2fcc7b5e"}
Dec 03 20:06:46.793456 master-0 kubenswrapper[9368]: I1203 20:06:46.793423 9368 scope.go:117] "RemoveContainer" containerID="89ed390af07eecb0f2a6fd24fe986b57e8e8f83dbf2ff2202963967a2fcc7b5e"
Dec 03 20:06:46.793732 master-0 kubenswrapper[9368]: E1203 20:06:46.793689 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-67c4cff67d-p7xj5_openshift-kube-storage-version-migrator-operator(11e2c94f-f9e9-415b-a550-3006a4632ba4)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" podUID="11e2c94f-f9e9-415b-a550-3006a4632ba4"
Dec 03 20:06:46.795316 master-0 kubenswrapper[9368]: I1203 20:06:46.795281 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-7978bf889c-mqpzf_78a864f2-934f-4197-9753-24c9bc7f1fca/etcd-operator/3.log"
Dec 03 20:06:46.796077 master-0 kubenswrapper[9368]: I1203 20:06:46.796018 9368 generic.go:334] "Generic (PLEG): container finished" podID="78a864f2-934f-4197-9753-24c9bc7f1fca" containerID="9f244f1d436466a4ae57b971d0160d2b30815a69ea07caf71d6b0728312b0abd" exitCode=255
Dec 03 20:06:46.796199 master-0 kubenswrapper[9368]: I1203 20:06:46.796130 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" event={"ID":"78a864f2-934f-4197-9753-24c9bc7f1fca","Type":"ContainerDied","Data":"9f244f1d436466a4ae57b971d0160d2b30815a69ea07caf71d6b0728312b0abd"}
Dec 03 20:06:46.797062 master-0 kubenswrapper[9368]: I1203 20:06:46.796934 9368 scope.go:117] "RemoveContainer" containerID="9f244f1d436466a4ae57b971d0160d2b30815a69ea07caf71d6b0728312b0abd"
Dec 03 20:06:46.797387 master-0 kubenswrapper[9368]: E1203 20:06:46.797342 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=etcd-operator pod=etcd-operator-7978bf889c-mqpzf_openshift-etcd-operator(78a864f2-934f-4197-9753-24c9bc7f1fca)\"" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" podUID="78a864f2-934f-4197-9753-24c9bc7f1fca"
Dec 03 20:06:46.804874 master-0 kubenswrapper[9368]: I1203 20:06:46.804819 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/4.log"
Dec 03 20:06:46.806277 master-0 kubenswrapper[9368]: I1203 20:06:46.805966 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/3.log"
Dec 03 20:06:46.806277 master-0 kubenswrapper[9368]: I1203 20:06:46.806122 9368 generic.go:334] "Generic (PLEG): container finished" podID="01d51d9a-9beb-4357-9dc2-aeac210cd0c4" containerID="c2730eaef31938f9b283223c81622c1d4bbc549630ded57fc1762a2568d60b23" exitCode=255
Dec 03 20:06:46.806277 master-0 kubenswrapper[9368]: I1203 20:06:46.806217 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" event={"ID":"01d51d9a-9beb-4357-9dc2-aeac210cd0c4","Type":"ContainerDied","Data":"c2730eaef31938f9b283223c81622c1d4bbc549630ded57fc1762a2568d60b23"}
Dec 03 20:06:46.807324 master-0 kubenswrapper[9368]: I1203 20:06:46.807040 9368 scope.go:117] "RemoveContainer" containerID="c2730eaef31938f9b283223c81622c1d4bbc549630ded57fc1762a2568d60b23"
Dec 03 20:06:46.807537 master-0 kubenswrapper[9368]: E1203 20:06:46.807476 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=service-ca-operator pod=service-ca-operator-56f5898f45-v6rp5_openshift-service-ca-operator(01d51d9a-9beb-4357-9dc2-aeac210cd0c4)\"" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" podUID="01d51d9a-9beb-4357-9dc2-aeac210cd0c4"
Dec 03 20:06:46.815039 master-0 kubenswrapper[9368]: I1203 20:06:46.814954 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/4.log"
Dec 03 20:06:46.816118 master-0 kubenswrapper[9368]: I1203 20:06:46.816051 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/3.log"
Dec 03 20:06:46.816237 master-0 kubenswrapper[9368]: I1203 20:06:46.816152 9368 generic.go:334] "Generic (PLEG): container finished" podID="6eb4700c-6af0-468b-afc8-1e09b902d6bf" containerID="728aa51e420a0e8c358ef69d6ddcb175d50c7be37aab4f4fdfde93a0791a7b8e" exitCode=255
Dec 03 20:06:46.816414 master-0 kubenswrapper[9368]: I1203 20:06:46.816351 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" event={"ID":"6eb4700c-6af0-468b-afc8-1e09b902d6bf","Type":"ContainerDied","Data":"728aa51e420a0e8c358ef69d6ddcb175d50c7be37aab4f4fdfde93a0791a7b8e"}
Dec 03 20:06:46.818158 master-0 kubenswrapper[9368]: I1203 20:06:46.818108 9368 scope.go:117] "RemoveContainer" containerID="728aa51e420a0e8c358ef69d6ddcb175d50c7be37aab4f4fdfde93a0791a7b8e"
Dec 03 20:06:46.820745 master-0 kubenswrapper[9368]: E1203 20:06:46.820685 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=network-operator pod=network-operator-6cbf58c977-w7d8t_openshift-network-operator(6eb4700c-6af0-468b-afc8-1e09b902d6bf)\"" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" podUID="6eb4700c-6af0-468b-afc8-1e09b902d6bf"
Dec 03 20:06:46.821695 master-0 kubenswrapper[9368]: I1203 20:06:46.821623 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/4.log"
Dec 03 20:06:46.825898 master-0 kubenswrapper[9368]: I1203 20:06:46.825836 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/3.log"
Dec 03 20:06:46.826047 master-0 kubenswrapper[9368]: I1203 20:06:46.825949 9368 generic.go:334] "Generic (PLEG): container finished" podID="daa8efc0-4514-4a14-80f5-ab9eca53a127" containerID="2fddc42d6267903d2d9ec20253e1576f35e19a3bb53e9ddf0c42ac6c45e614ec" exitCode=255
Dec 03 20:06:46.826172 master-0 kubenswrapper[9368]: I1203 20:06:46.826080 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" event={"ID":"daa8efc0-4514-4a14-80f5-ab9eca53a127","Type":"ContainerDied","Data":"2fddc42d6267903d2d9ec20253e1576f35e19a3bb53e9ddf0c42ac6c45e614ec"}
Dec 03 20:06:46.828397 master-0 kubenswrapper[9368]: I1203 20:06:46.828330 9368 scope.go:117] "RemoveContainer" containerID="9936bd164d7a83dfd6c86c4312838d63181895add63b7d1de35a090b8b7d369b"
Dec 03 20:06:46.829312 master-0 kubenswrapper[9368]: I1203 20:06:46.829252 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-7b795784b8-4gppw_b84835e3-e8bc-4aa4-a8f3-f9be702a358a/csi-snapshot-controller-operator/1.log"
Dec 03 20:06:46.829970 master-0 kubenswrapper[9368]: I1203 20:06:46.829432 9368 scope.go:117] "RemoveContainer" containerID="2fddc42d6267903d2d9ec20253e1576f35e19a3bb53e9ddf0c42ac6c45e614ec"
Dec 03 20:06:46.830472 master-0 kubenswrapper[9368]: I1203 20:06:46.830412 9368 generic.go:334] "Generic (PLEG): container finished" podID="b84835e3-e8bc-4aa4-a8f3-f9be702a358a" containerID="956eb1f870a19470f7c8b22853a74608f38c1690adb7d4ce8636e2637a784bb5" exitCode=255
Dec 03 20:06:46.830619 master-0 kubenswrapper[9368]: I1203 20:06:46.830535 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw" event={"ID":"b84835e3-e8bc-4aa4-a8f3-f9be702a358a","Type":"ContainerDied","Data":"956eb1f870a19470f7c8b22853a74608f38c1690adb7d4ce8636e2637a784bb5"} Dec 03 20:06:46.831737 master-0 kubenswrapper[9368]: I1203 20:06:46.831682 9368 scope.go:117] "RemoveContainer" containerID="956eb1f870a19470f7c8b22853a74608f38c1690adb7d4ce8636e2637a784bb5" Dec 03 20:06:46.832171 master-0 kubenswrapper[9368]: E1203 20:06:46.832099 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"csi-snapshot-controller-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=csi-snapshot-controller-operator pod=csi-snapshot-controller-operator-7b795784b8-4gppw_openshift-cluster-storage-operator(b84835e3-e8bc-4aa4-a8f3-f9be702a358a)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw" podUID="b84835e3-e8bc-4aa4-a8f3-f9be702a358a" Dec 03 20:06:46.833562 master-0 kubenswrapper[9368]: I1203 20:06:46.833509 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/4.log" Dec 03 20:06:46.834081 master-0 kubenswrapper[9368]: I1203 20:06:46.834052 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/3.log" Dec 03 20:06:46.834182 master-0 kubenswrapper[9368]: I1203 20:06:46.834103 9368 generic.go:334] "Generic (PLEG): container finished" podID="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f" containerID="db25cf44f0675c418850d8d41463efcb1765ff94722958664210b9165ac00ff3" exitCode=255 Dec 03 20:06:46.834287 master-0 kubenswrapper[9368]: I1203 20:06:46.834221 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" event={"ID":"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f","Type":"ContainerDied","Data":"db25cf44f0675c418850d8d41463efcb1765ff94722958664210b9165ac00ff3"} Dec 03 20:06:46.834864 master-0 kubenswrapper[9368]: I1203 20:06:46.834729 9368 scope.go:117] "RemoveContainer" containerID="db25cf44f0675c418850d8d41463efcb1765ff94722958664210b9165ac00ff3" Dec 03 20:06:46.834990 master-0 kubenswrapper[9368]: E1203 20:06:46.830561 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-controller-manager-operator pod=openshift-controller-manager-operator-7c4697b5f5-8jzqh_openshift-controller-manager-operator(daa8efc0-4514-4a14-80f5-ab9eca53a127)\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" podUID="daa8efc0-4514-4a14-80f5-ab9eca53a127" Dec 03 20:06:46.835073 master-0 kubenswrapper[9368]: E1203 20:06:46.835056 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-apiserver-operator pod=openshift-apiserver-operator-667484ff5-lsltt_openshift-apiserver-operator(d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f)\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" podUID="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f" Dec 03 20:06:46.836683 master-0 kubenswrapper[9368]: I1203 20:06:46.836644 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-j2wgx_5b3ee9a2-0f17-4a04-9191-b60684ef6c29/kube-scheduler-operator-container/3.log" Dec 03 20:06:46.837409 master-0 kubenswrapper[9368]: I1203 20:06:46.837352 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-j2wgx_5b3ee9a2-0f17-4a04-9191-b60684ef6c29/kube-scheduler-operator-container/2.log" Dec 03 20:06:46.837539 master-0 kubenswrapper[9368]: I1203 20:06:46.837431 9368 generic.go:334] "Generic (PLEG): container finished" podID="5b3ee9a2-0f17-4a04-9191-b60684ef6c29" containerID="0261bc02d30c23a023b1b2c969bc5effe6635690c48ec42070b21b48058d37f0" exitCode=255 Dec 03 20:06:46.837615 master-0 kubenswrapper[9368]: I1203 20:06:46.837522 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" event={"ID":"5b3ee9a2-0f17-4a04-9191-b60684ef6c29","Type":"ContainerDied","Data":"0261bc02d30c23a023b1b2c969bc5effe6635690c48ec42070b21b48058d37f0"} Dec 03 20:06:46.837958 master-0 kubenswrapper[9368]: I1203 20:06:46.837919 9368 scope.go:117] "RemoveContainer" containerID="0261bc02d30c23a023b1b2c969bc5effe6635690c48ec42070b21b48058d37f0" Dec 03 20:06:46.838149 master-0 kubenswrapper[9368]: E1203 20:06:46.838110 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-scheduler-operator-container pod=openshift-kube-scheduler-operator-5f574c6c79-j2wgx_openshift-kube-scheduler-operator(5b3ee9a2-0f17-4a04-9191-b60684ef6c29)\"" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" podUID="5b3ee9a2-0f17-4a04-9191-b60684ef6c29" Dec 03 20:06:46.843360 master-0 kubenswrapper[9368]: I1203 20:06:46.843316 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-6b8bb995f7-bj4vz_63e3d36d-1676-4f90-ac9a-d85b861a4655/service-ca-controller/1.log" Dec 03 20:06:46.844102 master-0 kubenswrapper[9368]: I1203 20:06:46.844038 9368 generic.go:334] "Generic (PLEG): container finished" 
podID="63e3d36d-1676-4f90-ac9a-d85b861a4655" containerID="6807dbf16e067be7ea486ac34b787f907c6cd7781565aaddc8b3f973b1b71212" exitCode=255 Dec 03 20:06:46.844298 master-0 kubenswrapper[9368]: I1203 20:06:46.844136 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" event={"ID":"63e3d36d-1676-4f90-ac9a-d85b861a4655","Type":"ContainerDied","Data":"6807dbf16e067be7ea486ac34b787f907c6cd7781565aaddc8b3f973b1b71212"} Dec 03 20:06:46.845103 master-0 kubenswrapper[9368]: I1203 20:06:46.845026 9368 scope.go:117] "RemoveContainer" containerID="6807dbf16e067be7ea486ac34b787f907c6cd7781565aaddc8b3f973b1b71212" Dec 03 20:06:46.845498 master-0 kubenswrapper[9368]: E1203 20:06:46.845437 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=service-ca-controller pod=service-ca-6b8bb995f7-bj4vz_openshift-service-ca(63e3d36d-1676-4f90-ac9a-d85b861a4655)\"" pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" podUID="63e3d36d-1676-4f90-ac9a-d85b861a4655" Dec 03 20:06:46.846838 master-0 kubenswrapper[9368]: I1203 20:06:46.846763 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/4.log" Dec 03 20:06:46.847591 master-0 kubenswrapper[9368]: I1203 20:06:46.847530 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/3.log" Dec 03 20:06:46.847708 master-0 kubenswrapper[9368]: I1203 20:06:46.847641 9368 generic.go:334] "Generic (PLEG): container finished" podID="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3" 
containerID="64faeeb7a4647a9e5dd702400fe60f14013f02b00360bb310c4d37859f33d70c" exitCode=255 Dec 03 20:06:46.847811 master-0 kubenswrapper[9368]: I1203 20:06:46.847706 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" event={"ID":"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3","Type":"ContainerDied","Data":"64faeeb7a4647a9e5dd702400fe60f14013f02b00360bb310c4d37859f33d70c"} Dec 03 20:06:46.848880 master-0 kubenswrapper[9368]: I1203 20:06:46.848816 9368 scope.go:117] "RemoveContainer" containerID="64faeeb7a4647a9e5dd702400fe60f14013f02b00360bb310c4d37859f33d70c" Dec 03 20:06:46.849418 master-0 kubenswrapper[9368]: E1203 20:06:46.849313 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-b5dddf8f5-79ccj_openshift-kube-controller-manager-operator(e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" podUID="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3" Dec 03 20:06:46.884425 master-0 kubenswrapper[9368]: I1203 20:06:46.884376 9368 scope.go:117] "RemoveContainer" containerID="d191b57d0995c3a104c3336c01e2a5bd2bc868dba6a6fcca53d04e312b18c0c9" Dec 03 20:06:46.925083 master-0 kubenswrapper[9368]: I1203 20:06:46.925016 9368 scope.go:117] "RemoveContainer" containerID="d44dad492e3736c612049c8b048068de134aee1a61264b8715dac1a1505eb90d" Dec 03 20:06:46.960728 master-0 kubenswrapper[9368]: I1203 20:06:46.960687 9368 scope.go:117] "RemoveContainer" containerID="6f8d03455884710e737b779ab993de7b077a6712d61dd531eb926a20dcac48c1" Dec 03 20:06:47.003712 master-0 kubenswrapper[9368]: I1203 20:06:47.003657 9368 scope.go:117] "RemoveContainer" 
containerID="89033761971c21121ad0eb89f27a17b463a2b2ad814a0f77f8444c0013b9927d" Dec 03 20:06:47.032238 master-0 kubenswrapper[9368]: I1203 20:06:47.032177 9368 scope.go:117] "RemoveContainer" containerID="f3b5610345e0a05c927b635b9b59c02c0bd317dc652790faf73852f8095009c9" Dec 03 20:06:47.066220 master-0 kubenswrapper[9368]: I1203 20:06:47.066148 9368 scope.go:117] "RemoveContainer" containerID="6d3d33e94c6f769c3d4f30283e26a8ebfb068648191bff388aba17779108057c" Dec 03 20:06:47.093496 master-0 kubenswrapper[9368]: I1203 20:06:47.093435 9368 scope.go:117] "RemoveContainer" containerID="193ee1ad3e7ee183f1ea38494d7735760027689afd79629a8d160747a2494f67" Dec 03 20:06:47.113840 master-0 kubenswrapper[9368]: I1203 20:06:47.113815 9368 scope.go:117] "RemoveContainer" containerID="d4087ecceb78b95c5961d00b583ffbdd19fde6d2e05194469b5beb565e8c4e58" Dec 03 20:06:47.130789 master-0 kubenswrapper[9368]: I1203 20:06:47.130726 9368 scope.go:117] "RemoveContainer" containerID="fbb527c9a5f9ae83b24668268584afb30442540a16ac4e78c92bdf23a3df3b8c" Dec 03 20:06:47.157392 master-0 kubenswrapper[9368]: I1203 20:06:47.157265 9368 scope.go:117] "RemoveContainer" containerID="59561622c420df151d8043e444eaec7dca0c22e244b1a6ac8880f20fe809e5c4" Dec 03 20:06:47.184153 master-0 kubenswrapper[9368]: I1203 20:06:47.184116 9368 scope.go:117] "RemoveContainer" containerID="fd5126a03583a9e60c4f08ab94ff3e4d6dff99b77efc94559f88151386831a39" Dec 03 20:06:47.858103 master-0 kubenswrapper[9368]: I1203 20:06:47.857945 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/3.log" Dec 03 20:06:47.860955 master-0 kubenswrapper[9368]: I1203 20:06:47.860901 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-7b795784b8-4gppw_b84835e3-e8bc-4aa4-a8f3-f9be702a358a/csi-snapshot-controller-operator/1.log" Dec 03 20:06:47.863142 master-0 kubenswrapper[9368]: I1203 20:06:47.863087 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/4.log" Dec 03 20:06:47.865322 master-0 kubenswrapper[9368]: I1203 20:06:47.865259 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/4.log" Dec 03 20:06:47.867570 master-0 kubenswrapper[9368]: I1203 20:06:47.867500 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/4.log" Dec 03 20:06:47.869926 master-0 kubenswrapper[9368]: I1203 20:06:47.869868 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-7978bf889c-mqpzf_78a864f2-934f-4197-9753-24c9bc7f1fca/etcd-operator/3.log" Dec 03 20:06:47.872448 master-0 kubenswrapper[9368]: I1203 20:06:47.872384 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/4.log" Dec 03 20:06:47.875074 master-0 kubenswrapper[9368]: I1203 20:06:47.875012 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-6b8bb995f7-bj4vz_63e3d36d-1676-4f90-ac9a-d85b861a4655/service-ca-controller/1.log" Dec 03 20:06:47.877412 master-0 kubenswrapper[9368]: I1203 20:06:47.877351 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/4.log" Dec 03 20:06:47.879738 master-0 kubenswrapper[9368]: I1203 20:06:47.879674 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/4.log" Dec 03 20:06:47.881962 master-0 kubenswrapper[9368]: I1203 20:06:47.881906 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/4.log" Dec 03 20:06:47.885110 master-0 kubenswrapper[9368]: I1203 20:06:47.885031 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-j2wgx_5b3ee9a2-0f17-4a04-9191-b60684ef6c29/kube-scheduler-operator-container/3.log" Dec 03 20:06:47.887771 master-0 kubenswrapper[9368]: I1203 20:06:47.887718 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-589f5cdc9d-4fzrl_f9f99422-7991-40ef-92a1-de2e603e47b9/cluster-olm-operator/2.log" Dec 03 20:06:48.608145 master-0 kubenswrapper[9368]: I1203 20:06:48.608030 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:06:48.721124 master-0 kubenswrapper[9368]: E1203 20:06:48.720922 9368 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{redhat-operators-6zrxk.187dccd4c9fcd9b8 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-6zrxk,UID:af6f6483-5ca1-48b7-90b5-b03d460d041a,APIVersion:v1,ResourceVersion:7188,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3c0962dbbad51633a7d97ef253d0249269bfe3bbef3bfe99a99457470e7a682\" in 43.813s (43.813s including waiting). Image size: 912736453 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:57:23.293026744 +0000 UTC m=+108.954276655,LastTimestamp:2025-12-03 19:57:23.293026744 +0000 UTC m=+108.954276655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 20:06:51.544452 master-0 kubenswrapper[9368]: I1203 20:06:51.544335 9368 scope.go:117] "RemoveContainer" containerID="da7a3994394a55c9298b32c298537581a06cd839f637078bf5012d0ce27db382" Dec 03 20:06:51.545421 master-0 kubenswrapper[9368]: E1203 20:06:51.544835 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-8xmrv_openshift-config-operator(0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" Dec 03 20:06:51.918536 master-0 kubenswrapper[9368]: I1203 20:06:51.918448 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:06:54.919111 master-0 kubenswrapper[9368]: I1203 20:06:54.918923 9368 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" 
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 20:06:57.544405 master-0 kubenswrapper[9368]: I1203 20:06:57.544308 9368 scope.go:117] "RemoveContainer" containerID="5e83bcf58af1482033711d7ef5e23c1429621a6a16b43c85914ace2af8aca901" Dec 03 20:06:57.545515 master-0 kubenswrapper[9368]: I1203 20:06:57.544659 9368 scope.go:117] "RemoveContainer" containerID="9f244f1d436466a4ae57b971d0160d2b30815a69ea07caf71d6b0728312b0abd" Dec 03 20:06:57.545515 master-0 kubenswrapper[9368]: E1203 20:06:57.544679 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-olm-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-olm-operator pod=cluster-olm-operator-589f5cdc9d-4fzrl_openshift-cluster-olm-operator(f9f99422-7991-40ef-92a1-de2e603e47b9)\"" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" podUID="f9f99422-7991-40ef-92a1-de2e603e47b9" Dec 03 20:06:57.545515 master-0 kubenswrapper[9368]: E1203 20:06:57.545043 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=etcd-operator pod=etcd-operator-7978bf889c-mqpzf_openshift-etcd-operator(78a864f2-934f-4197-9753-24c9bc7f1fca)\"" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" podUID="78a864f2-934f-4197-9753-24c9bc7f1fca" Dec 03 20:06:58.545050 master-0 kubenswrapper[9368]: I1203 20:06:58.544953 9368 scope.go:117] "RemoveContainer" containerID="c2730eaef31938f9b283223c81622c1d4bbc549630ded57fc1762a2568d60b23" Dec 03 20:06:58.545978 master-0 kubenswrapper[9368]: I1203 20:06:58.545151 9368 scope.go:117] "RemoveContainer" containerID="64faeeb7a4647a9e5dd702400fe60f14013f02b00360bb310c4d37859f33d70c" Dec 03 20:06:58.545978 
master-0 kubenswrapper[9368]: I1203 20:06:58.545343 9368 scope.go:117] "RemoveContainer" containerID="db25cf44f0675c418850d8d41463efcb1765ff94722958664210b9165ac00ff3" Dec 03 20:06:58.545978 master-0 kubenswrapper[9368]: E1203 20:06:58.545374 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=service-ca-operator pod=service-ca-operator-56f5898f45-v6rp5_openshift-service-ca-operator(01d51d9a-9beb-4357-9dc2-aeac210cd0c4)\"" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" podUID="01d51d9a-9beb-4357-9dc2-aeac210cd0c4" Dec 03 20:06:58.545978 master-0 kubenswrapper[9368]: E1203 20:06:58.545486 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-b5dddf8f5-79ccj_openshift-kube-controller-manager-operator(e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" podUID="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3" Dec 03 20:06:58.545978 master-0 kubenswrapper[9368]: E1203 20:06:58.545692 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-apiserver-operator pod=openshift-apiserver-operator-667484ff5-lsltt_openshift-apiserver-operator(d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f)\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" podUID="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f" Dec 03 20:06:59.543992 master-0 kubenswrapper[9368]: I1203 20:06:59.543906 9368 scope.go:117] "RemoveContainer" 
containerID="f7b2dd4d7eafdc4336ee0182ab9a0527c12ff38408c8d52991e189907554e424" Dec 03 20:06:59.544247 master-0 kubenswrapper[9368]: E1203 20:06:59.544198 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-storage-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-storage-operator pod=cluster-storage-operator-f84784664-wnl8p_openshift-cluster-storage-operator(f749c7f2-1fd7-4078-a92d-0ae5523998ac)\"" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" podUID="f749c7f2-1fd7-4078-a92d-0ae5523998ac" Dec 03 20:06:59.544353 master-0 kubenswrapper[9368]: I1203 20:06:59.544293 9368 scope.go:117] "RemoveContainer" containerID="956eb1f870a19470f7c8b22853a74608f38c1690adb7d4ce8636e2637a784bb5" Dec 03 20:06:59.544496 master-0 kubenswrapper[9368]: I1203 20:06:59.544455 9368 scope.go:117] "RemoveContainer" containerID="728aa51e420a0e8c358ef69d6ddcb175d50c7be37aab4f4fdfde93a0791a7b8e" Dec 03 20:06:59.544639 master-0 kubenswrapper[9368]: I1203 20:06:59.544595 9368 scope.go:117] "RemoveContainer" containerID="abf1acea0f13046f42e18d29f9f01a5591776e77d3e8cc4b525da74b968fc06b" Dec 03 20:06:59.544756 master-0 kubenswrapper[9368]: E1203 20:06:59.544723 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=network-operator pod=network-operator-6cbf58c977-w7d8t_openshift-network-operator(6eb4700c-6af0-468b-afc8-1e09b902d6bf)\"" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" podUID="6eb4700c-6af0-468b-afc8-1e09b902d6bf" Dec 03 20:06:59.545182 master-0 kubenswrapper[9368]: E1203 20:06:59.545160 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-operator 
pod=kube-apiserver-operator-5b557b5f57-9t9fn_openshift-kube-apiserver-operator(943feb0d-7d31-446a-9100-dfc4ef013d12)\"" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" podUID="943feb0d-7d31-446a-9100-dfc4ef013d12" Dec 03 20:06:59.976569 master-0 kubenswrapper[9368]: I1203 20:06:59.976504 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-7b795784b8-4gppw_b84835e3-e8bc-4aa4-a8f3-f9be702a358a/csi-snapshot-controller-operator/1.log" Dec 03 20:06:59.976569 master-0 kubenswrapper[9368]: I1203 20:06:59.976563 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw" event={"ID":"b84835e3-e8bc-4aa4-a8f3-f9be702a358a","Type":"ContainerStarted","Data":"cec06c56e683cc0577fad0a71ec4c6d696a85de6a5454d15d4616410438d6c01"} Dec 03 20:07:00.545033 master-0 kubenswrapper[9368]: I1203 20:07:00.544960 9368 scope.go:117] "RemoveContainer" containerID="89ed390af07eecb0f2a6fd24fe986b57e8e8f83dbf2ff2202963967a2fcc7b5e" Dec 03 20:07:00.545247 master-0 kubenswrapper[9368]: I1203 20:07:00.545211 9368 scope.go:117] "RemoveContainer" containerID="6807dbf16e067be7ea486ac34b787f907c6cd7781565aaddc8b3f973b1b71212" Dec 03 20:07:00.545548 master-0 kubenswrapper[9368]: E1203 20:07:00.545328 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-67c4cff67d-p7xj5_openshift-kube-storage-version-migrator-operator(11e2c94f-f9e9-415b-a550-3006a4632ba4)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" podUID="11e2c94f-f9e9-415b-a550-3006a4632ba4" Dec 03 20:07:00.990147 master-0 kubenswrapper[9368]: I1203 
20:07:00.990044 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-6b8bb995f7-bj4vz_63e3d36d-1676-4f90-ac9a-d85b861a4655/service-ca-controller/1.log"
Dec 03 20:07:00.990147 master-0 kubenswrapper[9368]: I1203 20:07:00.990125 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" event={"ID":"63e3d36d-1676-4f90-ac9a-d85b861a4655","Type":"ContainerStarted","Data":"9bdf161e72b6c048ac479aec18a819118a43011cc40adece64e6528d1dc8ecda"}
Dec 03 20:07:01.544046 master-0 kubenswrapper[9368]: I1203 20:07:01.543993 9368 scope.go:117] "RemoveContainer" containerID="0261bc02d30c23a023b1b2c969bc5effe6635690c48ec42070b21b48058d37f0"
Dec 03 20:07:01.544687 master-0 kubenswrapper[9368]: E1203 20:07:01.544647 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-scheduler-operator-container pod=openshift-kube-scheduler-operator-5f574c6c79-j2wgx_openshift-kube-scheduler-operator(5b3ee9a2-0f17-4a04-9191-b60684ef6c29)\"" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" podUID="5b3ee9a2-0f17-4a04-9191-b60684ef6c29"
Dec 03 20:07:02.544649 master-0 kubenswrapper[9368]: I1203 20:07:02.544558 9368 scope.go:117] "RemoveContainer" containerID="2fddc42d6267903d2d9ec20253e1576f35e19a3bb53e9ddf0c42ac6c45e614ec"
Dec 03 20:07:02.545573 master-0 kubenswrapper[9368]: E1203 20:07:02.544924 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-controller-manager-operator pod=openshift-controller-manager-operator-7c4697b5f5-8jzqh_openshift-controller-manager-operator(daa8efc0-4514-4a14-80f5-ab9eca53a127)\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" podUID="daa8efc0-4514-4a14-80f5-ab9eca53a127"
Dec 03 20:07:02.707023 master-0 kubenswrapper[9368]: E1203 20:07:02.706938 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Dec 03 20:07:04.549299 master-0 kubenswrapper[9368]: I1203 20:07:04.549235 9368 scope.go:117] "RemoveContainer" containerID="da7a3994394a55c9298b32c298537581a06cd839f637078bf5012d0ce27db382"
Dec 03 20:07:04.550452 master-0 kubenswrapper[9368]: E1203 20:07:04.549574 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-8xmrv_openshift-config-operator(0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9"
Dec 03 20:07:04.919265 master-0 kubenswrapper[9368]: I1203 20:07:04.919177 9368 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 20:07:07.660116 master-0 kubenswrapper[9368]: I1203 20:07:07.659919 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf"
Dec 03 20:07:07.660923 master-0 kubenswrapper[9368]: I1203 20:07:07.660652 9368 scope.go:117] "RemoveContainer" containerID="9f244f1d436466a4ae57b971d0160d2b30815a69ea07caf71d6b0728312b0abd"
Dec 03 20:07:07.661260 master-0 kubenswrapper[9368]: E1203 20:07:07.661200 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=etcd-operator pod=etcd-operator-7978bf889c-mqpzf_openshift-etcd-operator(78a864f2-934f-4197-9753-24c9bc7f1fca)\"" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" podUID="78a864f2-934f-4197-9753-24c9bc7f1fca"
Dec 03 20:07:09.544886 master-0 kubenswrapper[9368]: I1203 20:07:09.544761 9368 scope.go:117] "RemoveContainer" containerID="c2730eaef31938f9b283223c81622c1d4bbc549630ded57fc1762a2568d60b23"
Dec 03 20:07:09.545669 master-0 kubenswrapper[9368]: E1203 20:07:09.545149 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=service-ca-operator pod=service-ca-operator-56f5898f45-v6rp5_openshift-service-ca-operator(01d51d9a-9beb-4357-9dc2-aeac210cd0c4)\"" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" podUID="01d51d9a-9beb-4357-9dc2-aeac210cd0c4"
Dec 03 20:07:10.544826 master-0 kubenswrapper[9368]: I1203 20:07:10.544684 9368 scope.go:117] "RemoveContainer" containerID="abf1acea0f13046f42e18d29f9f01a5591776e77d3e8cc4b525da74b968fc06b"
Dec 03 20:07:10.545290 master-0 kubenswrapper[9368]: I1203 20:07:10.544923 9368 scope.go:117] "RemoveContainer" containerID="64faeeb7a4647a9e5dd702400fe60f14013f02b00360bb310c4d37859f33d70c"
Dec 03 20:07:10.545290 master-0 kubenswrapper[9368]: E1203 20:07:10.545065 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-operator pod=kube-apiserver-operator-5b557b5f57-9t9fn_openshift-kube-apiserver-operator(943feb0d-7d31-446a-9100-dfc4ef013d12)\"" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" podUID="943feb0d-7d31-446a-9100-dfc4ef013d12"
Dec 03 20:07:10.545290 master-0 kubenswrapper[9368]: I1203 20:07:10.545104 9368 scope.go:117] "RemoveContainer" containerID="f7b2dd4d7eafdc4336ee0182ab9a0527c12ff38408c8d52991e189907554e424"
Dec 03 20:07:10.546130 master-0 kubenswrapper[9368]: E1203 20:07:10.545305 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-b5dddf8f5-79ccj_openshift-kube-controller-manager-operator(e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" podUID="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3"
Dec 03 20:07:10.546130 master-0 kubenswrapper[9368]: I1203 20:07:10.545439 9368 scope.go:117] "RemoveContainer" containerID="5e83bcf58af1482033711d7ef5e23c1429621a6a16b43c85914ace2af8aca901"
Dec 03 20:07:10.546130 master-0 kubenswrapper[9368]: E1203 20:07:10.545456 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-storage-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-storage-operator pod=cluster-storage-operator-f84784664-wnl8p_openshift-cluster-storage-operator(f749c7f2-1fd7-4078-a92d-0ae5523998ac)\"" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" podUID="f749c7f2-1fd7-4078-a92d-0ae5523998ac"
Dec 03 20:07:11.062521 master-0 kubenswrapper[9368]: I1203 20:07:11.062450 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-589f5cdc9d-4fzrl_f9f99422-7991-40ef-92a1-de2e603e47b9/cluster-olm-operator/2.log"
Dec 03 20:07:11.064212 master-0 kubenswrapper[9368]: I1203 20:07:11.064114 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" event={"ID":"f9f99422-7991-40ef-92a1-de2e603e47b9","Type":"ContainerStarted","Data":"c99347901e9ecb655c04e665218ca02d83676ca50dfba88f6005bd259789ce67"}
Dec 03 20:07:12.544118 master-0 kubenswrapper[9368]: I1203 20:07:12.544010 9368 scope.go:117] "RemoveContainer" containerID="db25cf44f0675c418850d8d41463efcb1765ff94722958664210b9165ac00ff3"
Dec 03 20:07:12.545004 master-0 kubenswrapper[9368]: I1203 20:07:12.544381 9368 scope.go:117] "RemoveContainer" containerID="89ed390af07eecb0f2a6fd24fe986b57e8e8f83dbf2ff2202963967a2fcc7b5e"
Dec 03 20:07:12.545004 master-0 kubenswrapper[9368]: E1203 20:07:12.544459 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-apiserver-operator pod=openshift-apiserver-operator-667484ff5-lsltt_openshift-apiserver-operator(d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f)\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" podUID="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f"
Dec 03 20:07:12.545004 master-0 kubenswrapper[9368]: I1203 20:07:12.544720 9368 scope.go:117] "RemoveContainer" containerID="728aa51e420a0e8c358ef69d6ddcb175d50c7be37aab4f4fdfde93a0791a7b8e"
Dec 03 20:07:12.545004 master-0 kubenswrapper[9368]: E1203 20:07:12.544738 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-67c4cff67d-p7xj5_openshift-kube-storage-version-migrator-operator(11e2c94f-f9e9-415b-a550-3006a4632ba4)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" podUID="11e2c94f-f9e9-415b-a550-3006a4632ba4"
Dec 03 20:07:12.545662 master-0 kubenswrapper[9368]: E1203 20:07:12.545584 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=network-operator pod=network-operator-6cbf58c977-w7d8t_openshift-network-operator(6eb4700c-6af0-468b-afc8-1e09b902d6bf)\"" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" podUID="6eb4700c-6af0-468b-afc8-1e09b902d6bf"
Dec 03 20:07:14.919079 master-0 kubenswrapper[9368]: I1203 20:07:14.918903 9368 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 20:07:14.919079 master-0 kubenswrapper[9368]: I1203 20:07:14.919031 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 20:07:14.919961 master-0 kubenswrapper[9368]: I1203 20:07:14.919930 9368 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967"} pod="kube-system/bootstrap-kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Dec 03 20:07:14.920039 master-0 kubenswrapper[9368]: I1203 20:07:14.919989 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" containerID="cri-o://46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967" gracePeriod=30
Dec 03 20:07:15.043552 master-0 kubenswrapper[9368]: E1203 20:07:15.043466 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=bootstrap-kube-controller-manager-master-0_kube-system(7bce50c457ac1f4721bc81a570dd238a)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a"
Dec 03 20:07:15.094166 master-0 kubenswrapper[9368]: I1203 20:07:15.094097 9368 generic.go:334] "Generic (PLEG): container finished" podID="7bce50c457ac1f4721bc81a570dd238a" containerID="46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967" exitCode=255
Dec 03 20:07:15.094464 master-0 kubenswrapper[9368]: I1203 20:07:15.094171 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerDied","Data":"46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967"}
Dec 03 20:07:15.094464 master-0 kubenswrapper[9368]: I1203 20:07:15.094281 9368 scope.go:117] "RemoveContainer" containerID="2cf827f1b0ff93b50c80872c6d1a48b2d6dcae8bf37fb7372318857b2511c290"
Dec 03 20:07:15.095078 master-0 kubenswrapper[9368]: I1203 20:07:15.095020 9368 scope.go:117] "RemoveContainer" containerID="46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967"
Dec 03 20:07:15.095533 master-0 kubenswrapper[9368]: E1203 20:07:15.095373 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=bootstrap-kube-controller-manager-master-0_kube-system(7bce50c457ac1f4721bc81a570dd238a)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a"
Dec 03 20:07:15.097003 master-0 kubenswrapper[9368]: I1203 20:07:15.096930 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/5.log"
Dec 03 20:07:15.097970 master-0 kubenswrapper[9368]: I1203 20:07:15.097919 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/4.log"
Dec 03 20:07:15.098274 master-0 kubenswrapper[9368]: I1203 20:07:15.098023 9368 generic.go:334] "Generic (PLEG): container finished" podID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" containerID="0b22734703d42f07c436963e348c3be11ab4f5053e6afed5996abb0dab7d690d" exitCode=255
Dec 03 20:07:15.098274 master-0 kubenswrapper[9368]: I1203 20:07:15.098082 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" event={"ID":"a185ee17-4b4b-4d20-a8ed-56a2a01f1807","Type":"ContainerDied","Data":"0b22734703d42f07c436963e348c3be11ab4f5053e6afed5996abb0dab7d690d"}
Dec 03 20:07:15.098933 master-0 kubenswrapper[9368]: I1203 20:07:15.098843 9368 scope.go:117] "RemoveContainer" containerID="0b22734703d42f07c436963e348c3be11ab4f5053e6afed5996abb0dab7d690d"
Dec 03 20:07:15.099318 master-0 kubenswrapper[9368]: E1203 20:07:15.099252 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=authentication-operator pod=authentication-operator-7479ffdf48-mfwhz_openshift-authentication-operator(a185ee17-4b4b-4d20-a8ed-56a2a01f1807)\"" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podUID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807"
Dec 03 20:07:15.122245 master-0 kubenswrapper[9368]: I1203 20:07:15.121930 9368 scope.go:117] "RemoveContainer" containerID="9e5ef7b3c0490a710149a5e033b19d384ce5d0dfe6bb0ef15f5d72d083cc1ce9"
Dec 03 20:07:16.105794 master-0 kubenswrapper[9368]: I1203 20:07:16.105731 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/5.log"
Dec 03 20:07:16.525255 master-0 kubenswrapper[9368]: I1203 20:07:16.525144 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Dec 03 20:07:16.526072 master-0 kubenswrapper[9368]: I1203 20:07:16.526028 9368 scope.go:117] "RemoveContainer" containerID="46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967"
Dec 03 20:07:16.526383 master-0 kubenswrapper[9368]: E1203 20:07:16.526328 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=bootstrap-kube-controller-manager-master-0_kube-system(7bce50c457ac1f4721bc81a570dd238a)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a"
Dec 03 20:07:16.544040 master-0 kubenswrapper[9368]: I1203 20:07:16.543979 9368 scope.go:117] "RemoveContainer" containerID="0261bc02d30c23a023b1b2c969bc5effe6635690c48ec42070b21b48058d37f0"
Dec 03 20:07:16.544405 master-0 kubenswrapper[9368]: E1203 20:07:16.544337 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-scheduler-operator-container pod=openshift-kube-scheduler-operator-5f574c6c79-j2wgx_openshift-kube-scheduler-operator(5b3ee9a2-0f17-4a04-9191-b60684ef6c29)\"" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" podUID="5b3ee9a2-0f17-4a04-9191-b60684ef6c29"
Dec 03 20:07:17.544578 master-0 kubenswrapper[9368]: I1203 20:07:17.544457 9368 scope.go:117] "RemoveContainer" containerID="2fddc42d6267903d2d9ec20253e1576f35e19a3bb53e9ddf0c42ac6c45e614ec"
Dec 03 20:07:17.545755 master-0 kubenswrapper[9368]: I1203 20:07:17.544944 9368 scope.go:117] "RemoveContainer" containerID="da7a3994394a55c9298b32c298537581a06cd839f637078bf5012d0ce27db382"
Dec 03 20:07:17.545755 master-0 kubenswrapper[9368]: E1203 20:07:17.545004 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-controller-manager-operator pod=openshift-controller-manager-operator-7c4697b5f5-8jzqh_openshift-controller-manager-operator(daa8efc0-4514-4a14-80f5-ab9eca53a127)\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" podUID="daa8efc0-4514-4a14-80f5-ab9eca53a127"
Dec 03 20:07:18.129418 master-0 kubenswrapper[9368]: I1203 20:07:18.129365 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-8xmrv_0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/openshift-config-operator/3.log"
Dec 03 20:07:18.130153 master-0 kubenswrapper[9368]: I1203 20:07:18.130103 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" event={"ID":"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9","Type":"ContainerStarted","Data":"30e0205b9f3aae7684b5e5dd37ee0470857f4a7020b8a45ab64071c7372511a7"}
Dec 03 20:07:18.130689 master-0 kubenswrapper[9368]: I1203 20:07:18.130599 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv"
Dec 03 20:07:18.545372 master-0 kubenswrapper[9368]: I1203 20:07:18.545177 9368 scope.go:117] "RemoveContainer" containerID="9f244f1d436466a4ae57b971d0160d2b30815a69ea07caf71d6b0728312b0abd"
Dec 03 20:07:18.546281 master-0 kubenswrapper[9368]: E1203 20:07:18.545560 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=etcd-operator pod=etcd-operator-7978bf889c-mqpzf_openshift-etcd-operator(78a864f2-934f-4197-9753-24c9bc7f1fca)\"" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" podUID="78a864f2-934f-4197-9753-24c9bc7f1fca"
Dec 03 20:07:19.708864 master-0 kubenswrapper[9368]: E1203 20:07:19.708535 9368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Dec 03 20:07:21.544019 master-0 kubenswrapper[9368]: I1203 20:07:21.543936 9368 scope.go:117] "RemoveContainer" containerID="f7b2dd4d7eafdc4336ee0182ab9a0527c12ff38408c8d52991e189907554e424"
Dec 03 20:07:21.545131 master-0 kubenswrapper[9368]: E1203 20:07:21.544290 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-storage-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-storage-operator pod=cluster-storage-operator-f84784664-wnl8p_openshift-cluster-storage-operator(f749c7f2-1fd7-4078-a92d-0ae5523998ac)\"" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" podUID="f749c7f2-1fd7-4078-a92d-0ae5523998ac"
Dec 03 20:07:21.664344 master-0 kubenswrapper[9368]: I1203 20:07:21.664259 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 20:07:21.664645 master-0 kubenswrapper[9368]: I1203 20:07:21.664366 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 20:07:22.463317 master-0 kubenswrapper[9368]: I1203 20:07:22.463229 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 20:07:22.464229 master-0 kubenswrapper[9368]: I1203 20:07:22.464186 9368 scope.go:117] "RemoveContainer" containerID="0b22734703d42f07c436963e348c3be11ab4f5053e6afed5996abb0dab7d690d"
Dec 03 20:07:22.464636 master-0 kubenswrapper[9368]: E1203 20:07:22.464591 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=authentication-operator pod=authentication-operator-7479ffdf48-mfwhz_openshift-authentication-operator(a185ee17-4b4b-4d20-a8ed-56a2a01f1807)\"" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podUID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807"
Dec 03 20:07:22.544802 master-0 kubenswrapper[9368]: I1203 20:07:22.544707 9368 scope.go:117] "RemoveContainer" containerID="c2730eaef31938f9b283223c81622c1d4bbc549630ded57fc1762a2568d60b23"
Dec 03 20:07:22.545521 master-0 kubenswrapper[9368]: E1203 20:07:22.545129 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=service-ca-operator pod=service-ca-operator-56f5898f45-v6rp5_openshift-service-ca-operator(01d51d9a-9beb-4357-9dc2-aeac210cd0c4)\"" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" podUID="01d51d9a-9beb-4357-9dc2-aeac210cd0c4"
Dec 03 20:07:22.723954 master-0 kubenswrapper[9368]: E1203 20:07:22.723632 9368 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{community-operators-r2c8x.187dccd4ca03a376 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:community-operators-r2c8x,UID:acb1d894-1bc0-478d-87fc-e9137291df70,APIVersion:v1,ResourceVersion:6948,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3c0962dbbad51633a7d97ef253d0249269bfe3bbef3bfe99a99457470e7a682\" in 43.889s (43.889s including waiting). Image size: 912736453 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 19:57:23.293471606 +0000 UTC m=+108.954721507,LastTimestamp:2025-12-03 19:57:23.293471606 +0000 UTC m=+108.954721507,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Dec 03 20:07:23.544564 master-0 kubenswrapper[9368]: I1203 20:07:23.544471 9368 scope.go:117] "RemoveContainer" containerID="db25cf44f0675c418850d8d41463efcb1765ff94722958664210b9165ac00ff3"
Dec 03 20:07:23.545022 master-0 kubenswrapper[9368]: E1203 20:07:23.544931 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-apiserver-operator pod=openshift-apiserver-operator-667484ff5-lsltt_openshift-apiserver-operator(d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f)\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" podUID="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f"
Dec 03 20:07:23.853008 master-0 kubenswrapper[9368]: I1203 20:07:23.852813 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 20:07:23.853008 master-0 kubenswrapper[9368]: I1203 20:07:23.852924 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 20:07:24.543880 master-0 kubenswrapper[9368]: I1203 20:07:24.543761 9368 scope.go:117] "RemoveContainer" containerID="728aa51e420a0e8c358ef69d6ddcb175d50c7be37aab4f4fdfde93a0791a7b8e"
Dec 03 20:07:24.544650 master-0 kubenswrapper[9368]: E1203 20:07:24.544601 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=network-operator pod=network-operator-6cbf58c977-w7d8t_openshift-network-operator(6eb4700c-6af0-468b-afc8-1e09b902d6bf)\"" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" podUID="6eb4700c-6af0-468b-afc8-1e09b902d6bf"
Dec 03 20:07:24.548896 master-0 kubenswrapper[9368]: I1203 20:07:24.548831 9368 scope.go:117] "RemoveContainer" containerID="89ed390af07eecb0f2a6fd24fe986b57e8e8f83dbf2ff2202963967a2fcc7b5e"
Dec 03 20:07:24.549677 master-0 kubenswrapper[9368]: E1203 20:07:24.549244 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-67c4cff67d-p7xj5_openshift-kube-storage-version-migrator-operator(11e2c94f-f9e9-415b-a550-3006a4632ba4)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" podUID="11e2c94f-f9e9-415b-a550-3006a4632ba4"
Dec 03 20:07:24.664077 master-0 kubenswrapper[9368]: I1203 20:07:24.664009 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 20:07:24.664534 master-0 kubenswrapper[9368]: I1203 20:07:24.664470 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 20:07:25.544635 master-0 kubenswrapper[9368]: I1203 20:07:25.544523 9368 scope.go:117] "RemoveContainer" containerID="abf1acea0f13046f42e18d29f9f01a5591776e77d3e8cc4b525da74b968fc06b"
Dec 03 20:07:25.544990 master-0 kubenswrapper[9368]: I1203 20:07:25.544763 9368 scope.go:117] "RemoveContainer" containerID="64faeeb7a4647a9e5dd702400fe60f14013f02b00360bb310c4d37859f33d70c"
Dec 03 20:07:25.544990 master-0 kubenswrapper[9368]: E1203 20:07:25.544843 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-operator pod=kube-apiserver-operator-5b557b5f57-9t9fn_openshift-kube-apiserver-operator(943feb0d-7d31-446a-9100-dfc4ef013d12)\"" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" podUID="943feb0d-7d31-446a-9100-dfc4ef013d12"
Dec 03 20:07:25.545278 master-0 kubenswrapper[9368]: E1203 20:07:25.545198 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-b5dddf8f5-79ccj_openshift-kube-controller-manager-operator(e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" podUID="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3"
Dec 03 20:07:26.853666 master-0 kubenswrapper[9368]: I1203 20:07:26.853551 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 20:07:26.853666 master-0 kubenswrapper[9368]: I1203 20:07:26.853659 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 20:07:27.544176 master-0 kubenswrapper[9368]: I1203 20:07:27.544113 9368 scope.go:117] "RemoveContainer" containerID="46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967"
Dec 03 20:07:27.544613 master-0 kubenswrapper[9368]: E1203 20:07:27.544408 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=bootstrap-kube-controller-manager-master-0_kube-system(7bce50c457ac1f4721bc81a570dd238a)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a"
Dec 03 20:07:27.664807 master-0 kubenswrapper[9368]: I1203 20:07:27.664720 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 20:07:27.665074 master-0 kubenswrapper[9368]: I1203 20:07:27.664824 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 20:07:27.665074 master-0 kubenswrapper[9368]: I1203 20:07:27.664889 9368 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv"
Dec 03 20:07:27.665754 master-0 kubenswrapper[9368]: I1203 20:07:27.665565 9368 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"30e0205b9f3aae7684b5e5dd37ee0470857f4a7020b8a45ab64071c7372511a7"} pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted"
Dec 03 20:07:27.665754 master-0 kubenswrapper[9368]: I1203 20:07:27.665635 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" containerID="cri-o://30e0205b9f3aae7684b5e5dd37ee0470857f4a7020b8a45ab64071c7372511a7" gracePeriod=30
Dec 03 20:07:27.679215 master-0 kubenswrapper[9368]: I1203 20:07:27.679133 9368 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-8xmrv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": read tcp 10.128.0.2:50032->10.128.0.17:8443: read: connection reset by peer" start-of-body=
Dec 03 20:07:27.679215 master-0 kubenswrapper[9368]: I1203 20:07:27.679204 9368 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": read tcp 10.128.0.2:50032->10.128.0.17:8443: read: connection reset by peer"
Dec 03 20:07:27.800356 master-0 kubenswrapper[9368]: E1203 20:07:27.800268 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-8xmrv_openshift-config-operator(0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9"
Dec 03 20:07:28.196353 master-0 kubenswrapper[9368]: I1203 20:07:28.196270 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-8xmrv_0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/openshift-config-operator/4.log"
Dec 03 20:07:28.197893 master-0 kubenswrapper[9368]: I1203 20:07:28.197855 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-8xmrv_0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/openshift-config-operator/3.log"
Dec 03 20:07:28.198684 master-0 kubenswrapper[9368]: I1203 20:07:28.198642 9368 generic.go:334] "Generic (PLEG): container finished" podID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerID="30e0205b9f3aae7684b5e5dd37ee0470857f4a7020b8a45ab64071c7372511a7" exitCode=255
Dec 03 20:07:28.198944 master-0 kubenswrapper[9368]: I1203 20:07:28.198767 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" event={"ID":"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9","Type":"ContainerDied","Data":"30e0205b9f3aae7684b5e5dd37ee0470857f4a7020b8a45ab64071c7372511a7"}
Dec 03 20:07:28.199138 master-0 kubenswrapper[9368]: I1203 20:07:28.199110 9368 scope.go:117] "RemoveContainer" containerID="da7a3994394a55c9298b32c298537581a06cd839f637078bf5012d0ce27db382"
Dec 03 20:07:28.200336 master-0 kubenswrapper[9368]: I1203 20:07:28.200273 9368 scope.go:117] "RemoveContainer" containerID="30e0205b9f3aae7684b5e5dd37ee0470857f4a7020b8a45ab64071c7372511a7"
Dec 03 20:07:28.201234 master-0 kubenswrapper[9368]: E1203 20:07:28.201027 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-8xmrv_openshift-config-operator(0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9"
Dec 03 20:07:28.544924 master-0 kubenswrapper[9368]: I1203 20:07:28.544748 9368 scope.go:117] "RemoveContainer" containerID="0261bc02d30c23a023b1b2c969bc5effe6635690c48ec42070b21b48058d37f0"
Dec 03 20:07:29.208174 master-0 kubenswrapper[9368]: I1203 20:07:29.208066 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-j2wgx_5b3ee9a2-0f17-4a04-9191-b60684ef6c29/kube-scheduler-operator-container/3.log"
Dec 03 20:07:29.208993 master-0 kubenswrapper[9368]: I1203 20:07:29.208237 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" event={"ID":"5b3ee9a2-0f17-4a04-9191-b60684ef6c29","Type":"ContainerStarted","Data":"2876828d2f94d1a52c820ea7850e2391ee158409615077a9baac5494af9caea0"}
Dec 03 20:07:29.211358 master-0 kubenswrapper[9368]: I1203 20:07:29.211294
9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-8xmrv_0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/openshift-config-operator/4.log" Dec 03 20:07:29.545069 master-0 kubenswrapper[9368]: I1203 20:07:29.544739 9368 scope.go:117] "RemoveContainer" containerID="2fddc42d6267903d2d9ec20253e1576f35e19a3bb53e9ddf0c42ac6c45e614ec" Dec 03 20:07:29.545462 master-0 kubenswrapper[9368]: E1203 20:07:29.545181 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-controller-manager-operator pod=openshift-controller-manager-operator-7c4697b5f5-8jzqh_openshift-controller-manager-operator(daa8efc0-4514-4a14-80f5-ab9eca53a127)\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" podUID="daa8efc0-4514-4a14-80f5-ab9eca53a127" Dec 03 20:07:31.235888 master-0 kubenswrapper[9368]: I1203 20:07:31.235824 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-7b795784b8-4gppw_b84835e3-e8bc-4aa4-a8f3-f9be702a358a/csi-snapshot-controller-operator/2.log" Dec 03 20:07:31.236499 master-0 kubenswrapper[9368]: I1203 20:07:31.236199 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-7b795784b8-4gppw_b84835e3-e8bc-4aa4-a8f3-f9be702a358a/csi-snapshot-controller-operator/1.log" Dec 03 20:07:31.236499 master-0 kubenswrapper[9368]: I1203 20:07:31.236236 9368 generic.go:334] "Generic (PLEG): container finished" podID="b84835e3-e8bc-4aa4-a8f3-f9be702a358a" containerID="cec06c56e683cc0577fad0a71ec4c6d696a85de6a5454d15d4616410438d6c01" exitCode=255 Dec 03 20:07:31.236499 master-0 kubenswrapper[9368]: I1203 20:07:31.236266 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw" event={"ID":"b84835e3-e8bc-4aa4-a8f3-f9be702a358a","Type":"ContainerDied","Data":"cec06c56e683cc0577fad0a71ec4c6d696a85de6a5454d15d4616410438d6c01"} Dec 03 20:07:31.236499 master-0 kubenswrapper[9368]: I1203 20:07:31.236297 9368 scope.go:117] "RemoveContainer" containerID="956eb1f870a19470f7c8b22853a74608f38c1690adb7d4ce8636e2637a784bb5" Dec 03 20:07:31.236915 master-0 kubenswrapper[9368]: I1203 20:07:31.236871 9368 scope.go:117] "RemoveContainer" containerID="cec06c56e683cc0577fad0a71ec4c6d696a85de6a5454d15d4616410438d6c01" Dec 03 20:07:31.237191 master-0 kubenswrapper[9368]: E1203 20:07:31.237150 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"csi-snapshot-controller-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=csi-snapshot-controller-operator pod=csi-snapshot-controller-operator-7b795784b8-4gppw_openshift-cluster-storage-operator(b84835e3-e8bc-4aa4-a8f3-f9be702a358a)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw" podUID="b84835e3-e8bc-4aa4-a8f3-f9be702a358a" Dec 03 20:07:32.246256 master-0 kubenswrapper[9368]: I1203 20:07:32.246160 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-6b8bb995f7-bj4vz_63e3d36d-1676-4f90-ac9a-d85b861a4655/service-ca-controller/2.log" Dec 03 20:07:32.247169 master-0 kubenswrapper[9368]: I1203 20:07:32.247111 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-6b8bb995f7-bj4vz_63e3d36d-1676-4f90-ac9a-d85b861a4655/service-ca-controller/1.log" Dec 03 20:07:32.247253 master-0 kubenswrapper[9368]: I1203 20:07:32.247172 9368 generic.go:334] "Generic (PLEG): container finished" podID="63e3d36d-1676-4f90-ac9a-d85b861a4655" containerID="9bdf161e72b6c048ac479aec18a819118a43011cc40adece64e6528d1dc8ecda" exitCode=255 Dec 03 20:07:32.247328 
master-0 kubenswrapper[9368]: I1203 20:07:32.247284 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" event={"ID":"63e3d36d-1676-4f90-ac9a-d85b861a4655","Type":"ContainerDied","Data":"9bdf161e72b6c048ac479aec18a819118a43011cc40adece64e6528d1dc8ecda"} Dec 03 20:07:32.247388 master-0 kubenswrapper[9368]: I1203 20:07:32.247326 9368 scope.go:117] "RemoveContainer" containerID="6807dbf16e067be7ea486ac34b787f907c6cd7781565aaddc8b3f973b1b71212" Dec 03 20:07:32.248153 master-0 kubenswrapper[9368]: I1203 20:07:32.248095 9368 scope.go:117] "RemoveContainer" containerID="9bdf161e72b6c048ac479aec18a819118a43011cc40adece64e6528d1dc8ecda" Dec 03 20:07:32.248581 master-0 kubenswrapper[9368]: E1203 20:07:32.248489 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=service-ca-controller pod=service-ca-6b8bb995f7-bj4vz_openshift-service-ca(63e3d36d-1676-4f90-ac9a-d85b861a4655)\"" pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" podUID="63e3d36d-1676-4f90-ac9a-d85b861a4655" Dec 03 20:07:32.250002 master-0 kubenswrapper[9368]: I1203 20:07:32.249960 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-7b795784b8-4gppw_b84835e3-e8bc-4aa4-a8f3-f9be702a358a/csi-snapshot-controller-operator/2.log" Dec 03 20:07:32.545098 master-0 kubenswrapper[9368]: I1203 20:07:32.544907 9368 scope.go:117] "RemoveContainer" containerID="9f244f1d436466a4ae57b971d0160d2b30815a69ea07caf71d6b0728312b0abd" Dec 03 20:07:33.261928 master-0 kubenswrapper[9368]: I1203 20:07:33.261834 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-7978bf889c-mqpzf_78a864f2-934f-4197-9753-24c9bc7f1fca/etcd-operator/3.log" Dec 03 20:07:33.262979 master-0 kubenswrapper[9368]: I1203 20:07:33.262008 9368 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" event={"ID":"78a864f2-934f-4197-9753-24c9bc7f1fca","Type":"ContainerStarted","Data":"a50351191867cc19f0f1a16f6b23f99fe51ba53bd7eedf1d7353c153b41763fe"} Dec 03 20:07:33.267648 master-0 kubenswrapper[9368]: I1203 20:07:33.267596 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-6b8bb995f7-bj4vz_63e3d36d-1676-4f90-ac9a-d85b861a4655/service-ca-controller/2.log" Dec 03 20:07:33.544596 master-0 kubenswrapper[9368]: I1203 20:07:33.544431 9368 scope.go:117] "RemoveContainer" containerID="0b22734703d42f07c436963e348c3be11ab4f5053e6afed5996abb0dab7d690d" Dec 03 20:07:33.544940 master-0 kubenswrapper[9368]: E1203 20:07:33.544748 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=authentication-operator pod=authentication-operator-7479ffdf48-mfwhz_openshift-authentication-operator(a185ee17-4b4b-4d20-a8ed-56a2a01f1807)\"" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podUID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" Dec 03 20:07:34.558306 master-0 kubenswrapper[9368]: I1203 20:07:34.558214 9368 scope.go:117] "RemoveContainer" containerID="db25cf44f0675c418850d8d41463efcb1765ff94722958664210b9165ac00ff3" Dec 03 20:07:34.559116 master-0 kubenswrapper[9368]: E1203 20:07:34.558568 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-apiserver-operator pod=openshift-apiserver-operator-667484ff5-lsltt_openshift-apiserver-operator(d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f)\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" podUID="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f" Dec 03 
20:07:36.545180 master-0 kubenswrapper[9368]: I1203 20:07:36.545070 9368 scope.go:117] "RemoveContainer" containerID="f7b2dd4d7eafdc4336ee0182ab9a0527c12ff38408c8d52991e189907554e424" Dec 03 20:07:36.545180 master-0 kubenswrapper[9368]: I1203 20:07:36.545153 9368 scope.go:117] "RemoveContainer" containerID="64faeeb7a4647a9e5dd702400fe60f14013f02b00360bb310c4d37859f33d70c" Dec 03 20:07:36.546125 master-0 kubenswrapper[9368]: E1203 20:07:36.545520 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-b5dddf8f5-79ccj_openshift-kube-controller-manager-operator(e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" podUID="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3" Dec 03 20:07:36.546125 master-0 kubenswrapper[9368]: I1203 20:07:36.545544 9368 scope.go:117] "RemoveContainer" containerID="c2730eaef31938f9b283223c81622c1d4bbc549630ded57fc1762a2568d60b23" Dec 03 20:07:36.546125 master-0 kubenswrapper[9368]: E1203 20:07:36.545763 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=service-ca-operator pod=service-ca-operator-56f5898f45-v6rp5_openshift-service-ca-operator(01d51d9a-9beb-4357-9dc2-aeac210cd0c4)\"" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" podUID="01d51d9a-9beb-4357-9dc2-aeac210cd0c4" Dec 03 20:07:37.297183 master-0 kubenswrapper[9368]: I1203 20:07:37.297148 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/3.log" Dec 03 20:07:37.297530 master-0 
kubenswrapper[9368]: I1203 20:07:37.297501 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" event={"ID":"f749c7f2-1fd7-4078-a92d-0ae5523998ac","Type":"ContainerStarted","Data":"07d3fec38a2309754ce3167b8bd96267db84f4553a7918e45fa727cdc7acc09c"} Dec 03 20:07:37.544364 master-0 kubenswrapper[9368]: I1203 20:07:37.544287 9368 scope.go:117] "RemoveContainer" containerID="abf1acea0f13046f42e18d29f9f01a5591776e77d3e8cc4b525da74b968fc06b" Dec 03 20:07:37.544660 master-0 kubenswrapper[9368]: E1203 20:07:37.544599 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-operator pod=kube-apiserver-operator-5b557b5f57-9t9fn_openshift-kube-apiserver-operator(943feb0d-7d31-446a-9100-dfc4ef013d12)\"" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" podUID="943feb0d-7d31-446a-9100-dfc4ef013d12" Dec 03 20:07:38.544483 master-0 kubenswrapper[9368]: I1203 20:07:38.544385 9368 scope.go:117] "RemoveContainer" containerID="728aa51e420a0e8c358ef69d6ddcb175d50c7be37aab4f4fdfde93a0791a7b8e" Dec 03 20:07:38.912716 master-0 kubenswrapper[9368]: E1203 20:07:38.544711 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=network-operator pod=network-operator-6cbf58c977-w7d8t_openshift-network-operator(6eb4700c-6af0-468b-afc8-1e09b902d6bf)\"" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" podUID="6eb4700c-6af0-468b-afc8-1e09b902d6bf" Dec 03 20:07:39.544139 master-0 kubenswrapper[9368]: I1203 20:07:39.544073 9368 scope.go:117] "RemoveContainer" containerID="89ed390af07eecb0f2a6fd24fe986b57e8e8f83dbf2ff2202963967a2fcc7b5e" Dec 03 20:07:39.544421 master-0 kubenswrapper[9368]: E1203 
20:07:39.544315 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-67c4cff67d-p7xj5_openshift-kube-storage-version-migrator-operator(11e2c94f-f9e9-415b-a550-3006a4632ba4)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" podUID="11e2c94f-f9e9-415b-a550-3006a4632ba4" Dec 03 20:07:39.544581 master-0 kubenswrapper[9368]: I1203 20:07:39.544513 9368 scope.go:117] "RemoveContainer" containerID="46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967" Dec 03 20:07:39.545292 master-0 kubenswrapper[9368]: E1203 20:07:39.544909 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=bootstrap-kube-controller-manager-master-0_kube-system(7bce50c457ac1f4721bc81a570dd238a)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" Dec 03 20:07:42.339548 master-0 kubenswrapper[9368]: I1203 20:07:42.339437 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_c46583dca69d50bb12bc004d7ee3300f/kube-scheduler-cert-syncer/0.log" Dec 03 20:07:42.340881 master-0 kubenswrapper[9368]: I1203 20:07:42.340822 9368 generic.go:334] "Generic (PLEG): container finished" podID="c46583dca69d50bb12bc004d7ee3300f" containerID="89262883631fb1dcd59cc7a0a7e0379a0e77dd0b25dc2b21a16372a6fe8d007e" exitCode=1 Dec 03 20:07:42.340986 master-0 kubenswrapper[9368]: I1203 20:07:42.340888 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" 
event={"ID":"c46583dca69d50bb12bc004d7ee3300f","Type":"ContainerDied","Data":"89262883631fb1dcd59cc7a0a7e0379a0e77dd0b25dc2b21a16372a6fe8d007e"} Dec 03 20:07:42.358102 master-0 kubenswrapper[9368]: I1203 20:07:42.358015 9368 scope.go:117] "RemoveContainer" containerID="89262883631fb1dcd59cc7a0a7e0379a0e77dd0b25dc2b21a16372a6fe8d007e" Dec 03 20:07:42.361276 master-0 kubenswrapper[9368]: I1203 20:07:42.361213 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 03 20:07:42.547799 master-0 kubenswrapper[9368]: I1203 20:07:42.544939 9368 scope.go:117] "RemoveContainer" containerID="2fddc42d6267903d2d9ec20253e1576f35e19a3bb53e9ddf0c42ac6c45e614ec" Dec 03 20:07:42.547799 master-0 kubenswrapper[9368]: E1203 20:07:42.545089 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-controller-manager-operator pod=openshift-controller-manager-operator-7c4697b5f5-8jzqh_openshift-controller-manager-operator(daa8efc0-4514-4a14-80f5-ab9eca53a127)\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" podUID="daa8efc0-4514-4a14-80f5-ab9eca53a127" Dec 03 20:07:43.356198 master-0 kubenswrapper[9368]: I1203 20:07:43.356117 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_c46583dca69d50bb12bc004d7ee3300f/kube-scheduler-cert-syncer/0.log" Dec 03 20:07:43.357278 master-0 kubenswrapper[9368]: I1203 20:07:43.357233 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"c46583dca69d50bb12bc004d7ee3300f","Type":"ContainerStarted","Data":"88a426b4c066f4efd6c67dba2d50d1674139b8757075139f8541302d74a32ce6"} Dec 03 20:07:43.385791 master-0 kubenswrapper[9368]: I1203 
20:07:43.385638 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=1.385608832 podStartE2EDuration="1.385608832s" podCreationTimestamp="2025-12-03 20:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:07:43.377648887 +0000 UTC m=+729.038898858" watchObservedRunningTime="2025-12-03 20:07:43.385608832 +0000 UTC m=+729.046858783" Dec 03 20:07:43.544007 master-0 kubenswrapper[9368]: I1203 20:07:43.543841 9368 scope.go:117] "RemoveContainer" containerID="cec06c56e683cc0577fad0a71ec4c6d696a85de6a5454d15d4616410438d6c01" Dec 03 20:07:43.544312 master-0 kubenswrapper[9368]: I1203 20:07:43.544152 9368 scope.go:117] "RemoveContainer" containerID="30e0205b9f3aae7684b5e5dd37ee0470857f4a7020b8a45ab64071c7372511a7" Dec 03 20:07:43.544312 master-0 kubenswrapper[9368]: E1203 20:07:43.544194 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"csi-snapshot-controller-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=csi-snapshot-controller-operator pod=csi-snapshot-controller-operator-7b795784b8-4gppw_openshift-cluster-storage-operator(b84835e3-e8bc-4aa4-a8f3-f9be702a358a)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw" podUID="b84835e3-e8bc-4aa4-a8f3-f9be702a358a" Dec 03 20:07:43.544533 master-0 kubenswrapper[9368]: E1203 20:07:43.544483 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-8xmrv_openshift-config-operator(0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" 
podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" Dec 03 20:07:46.544843 master-0 kubenswrapper[9368]: I1203 20:07:46.544741 9368 scope.go:117] "RemoveContainer" containerID="0b22734703d42f07c436963e348c3be11ab4f5053e6afed5996abb0dab7d690d" Dec 03 20:07:46.545760 master-0 kubenswrapper[9368]: I1203 20:07:46.545644 9368 scope.go:117] "RemoveContainer" containerID="9bdf161e72b6c048ac479aec18a819118a43011cc40adece64e6528d1dc8ecda" Dec 03 20:07:46.545879 master-0 kubenswrapper[9368]: E1203 20:07:46.545663 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=authentication-operator pod=authentication-operator-7479ffdf48-mfwhz_openshift-authentication-operator(a185ee17-4b4b-4d20-a8ed-56a2a01f1807)\"" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podUID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" Dec 03 20:07:46.546551 master-0 kubenswrapper[9368]: E1203 20:07:46.546507 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=service-ca-controller pod=service-ca-6b8bb995f7-bj4vz_openshift-service-ca(63e3d36d-1676-4f90-ac9a-d85b861a4655)\"" pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" podUID="63e3d36d-1676-4f90-ac9a-d85b861a4655" Dec 03 20:07:47.303659 master-0 kubenswrapper[9368]: I1203 20:07:47.303557 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"] Dec 03 20:07:47.303962 master-0 kubenswrapper[9368]: E1203 20:07:47.303926 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5cad72f-5bbf-42fc-9d63-545a01c98cbe" containerName="route-controller-manager" Dec 03 20:07:47.303962 master-0 kubenswrapper[9368]: I1203 20:07:47.303951 9368 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b5cad72f-5bbf-42fc-9d63-545a01c98cbe" containerName="route-controller-manager" Dec 03 20:07:47.304125 master-0 kubenswrapper[9368]: E1203 20:07:47.303982 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="186cc14f-5f58-43ca-8ffa-db07606ff0f7" containerName="installer" Dec 03 20:07:47.304125 master-0 kubenswrapper[9368]: I1203 20:07:47.303995 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="186cc14f-5f58-43ca-8ffa-db07606ff0f7" containerName="installer" Dec 03 20:07:47.304125 master-0 kubenswrapper[9368]: E1203 20:07:47.304022 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b16a8a-27a2-4a07-b5f9-10a5be2ec870" containerName="config-sync-controllers" Dec 03 20:07:47.304125 master-0 kubenswrapper[9368]: I1203 20:07:47.304035 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b16a8a-27a2-4a07-b5f9-10a5be2ec870" containerName="config-sync-controllers" Dec 03 20:07:47.304125 master-0 kubenswrapper[9368]: E1203 20:07:47.304050 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b16a8a-27a2-4a07-b5f9-10a5be2ec870" containerName="cluster-cloud-controller-manager" Dec 03 20:07:47.304125 master-0 kubenswrapper[9368]: I1203 20:07:47.304062 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b16a8a-27a2-4a07-b5f9-10a5be2ec870" containerName="cluster-cloud-controller-manager" Dec 03 20:07:47.304125 master-0 kubenswrapper[9368]: E1203 20:07:47.304088 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4afc7a-a338-4a2c-bada-22d4bac75d49" containerName="installer" Dec 03 20:07:47.304125 master-0 kubenswrapper[9368]: I1203 20:07:47.304102 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4afc7a-a338-4a2c-bada-22d4bac75d49" containerName="installer" Dec 03 20:07:47.304125 master-0 kubenswrapper[9368]: E1203 20:07:47.304132 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9afe01c7-825c-43d1-8425-0317cdde11d6" 
containerName="installer" Dec 03 20:07:47.304637 master-0 kubenswrapper[9368]: I1203 20:07:47.304146 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="9afe01c7-825c-43d1-8425-0317cdde11d6" containerName="installer" Dec 03 20:07:47.304637 master-0 kubenswrapper[9368]: E1203 20:07:47.304176 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61ca5373-413c-4824-ba19-13b99c3081e4" containerName="kube-rbac-proxy" Dec 03 20:07:47.304637 master-0 kubenswrapper[9368]: I1203 20:07:47.304189 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="61ca5373-413c-4824-ba19-13b99c3081e4" containerName="kube-rbac-proxy" Dec 03 20:07:47.304637 master-0 kubenswrapper[9368]: E1203 20:07:47.304235 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61ca5373-413c-4824-ba19-13b99c3081e4" containerName="machine-approver-controller" Dec 03 20:07:47.304637 master-0 kubenswrapper[9368]: I1203 20:07:47.304247 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="61ca5373-413c-4824-ba19-13b99c3081e4" containerName="machine-approver-controller" Dec 03 20:07:47.304637 master-0 kubenswrapper[9368]: E1203 20:07:47.304264 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b16a8a-27a2-4a07-b5f9-10a5be2ec870" containerName="kube-rbac-proxy" Dec 03 20:07:47.304637 master-0 kubenswrapper[9368]: I1203 20:07:47.304281 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b16a8a-27a2-4a07-b5f9-10a5be2ec870" containerName="kube-rbac-proxy" Dec 03 20:07:47.304637 master-0 kubenswrapper[9368]: E1203 20:07:47.304312 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bacd155a-fee3-4e5e-89a2-ab86f401d2ff" containerName="installer" Dec 03 20:07:47.304637 master-0 kubenswrapper[9368]: I1203 20:07:47.304328 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="bacd155a-fee3-4e5e-89a2-ab86f401d2ff" containerName="installer" Dec 03 20:07:47.304637 master-0 kubenswrapper[9368]: I1203 20:07:47.304581 9368 
memory_manager.go:354] "RemoveStaleState removing state" podUID="61b16a8a-27a2-4a07-b5f9-10a5be2ec870" containerName="config-sync-controllers" Dec 03 20:07:47.304637 master-0 kubenswrapper[9368]: I1203 20:07:47.304602 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5cad72f-5bbf-42fc-9d63-545a01c98cbe" containerName="route-controller-manager" Dec 03 20:07:47.304637 master-0 kubenswrapper[9368]: I1203 20:07:47.304619 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="61ca5373-413c-4824-ba19-13b99c3081e4" containerName="kube-rbac-proxy" Dec 03 20:07:47.304637 master-0 kubenswrapper[9368]: I1203 20:07:47.304634 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="61ca5373-413c-4824-ba19-13b99c3081e4" containerName="machine-approver-controller" Dec 03 20:07:47.304637 master-0 kubenswrapper[9368]: I1203 20:07:47.304649 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="61b16a8a-27a2-4a07-b5f9-10a5be2ec870" containerName="cluster-cloud-controller-manager" Dec 03 20:07:47.305436 master-0 kubenswrapper[9368]: I1203 20:07:47.304671 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="61b16a8a-27a2-4a07-b5f9-10a5be2ec870" containerName="kube-rbac-proxy" Dec 03 20:07:47.305436 master-0 kubenswrapper[9368]: I1203 20:07:47.304688 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="186cc14f-5f58-43ca-8ffa-db07606ff0f7" containerName="installer" Dec 03 20:07:47.305436 master-0 kubenswrapper[9368]: I1203 20:07:47.304711 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="9afe01c7-825c-43d1-8425-0317cdde11d6" containerName="installer" Dec 03 20:07:47.305436 master-0 kubenswrapper[9368]: I1203 20:07:47.304729 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="bacd155a-fee3-4e5e-89a2-ab86f401d2ff" containerName="installer" Dec 03 20:07:47.305436 master-0 kubenswrapper[9368]: I1203 20:07:47.304749 9368 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ce4afc7a-a338-4a2c-bada-22d4bac75d49" containerName="installer" Dec 03 20:07:47.305436 master-0 kubenswrapper[9368]: I1203 20:07:47.305399 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj" Dec 03 20:07:47.308605 master-0 kubenswrapper[9368]: I1203 20:07:47.308534 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 20:07:47.308605 master-0 kubenswrapper[9368]: I1203 20:07:47.308572 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 20:07:47.308918 master-0 kubenswrapper[9368]: I1203 20:07:47.308885 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 20:07:47.309404 master-0 kubenswrapper[9368]: I1203 20:07:47.309345 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 20:07:47.309679 master-0 kubenswrapper[9368]: I1203 20:07:47.309348 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 20:07:47.317930 master-0 kubenswrapper[9368]: I1203 20:07:47.317862 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"] Dec 03 20:07:47.498794 master-0 kubenswrapper[9368]: I1203 20:07:47.498274 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-config\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj" 
Dec 03 20:07:47.498794 master-0 kubenswrapper[9368]: I1203 20:07:47.498349 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c52974d8-fbe6-444b-97ae-468482eebac8-serving-cert\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:07:47.498794 master-0 kubenswrapper[9368]: I1203 20:07:47.498471 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7vxl\" (UniqueName: \"kubernetes.io/projected/c52974d8-fbe6-444b-97ae-468482eebac8-kube-api-access-p7vxl\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:07:47.499120 master-0 kubenswrapper[9368]: I1203 20:07:47.499025 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-client-ca\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:07:47.549990 master-0 kubenswrapper[9368]: I1203 20:07:47.549711 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hb9ml"]
Dec 03 20:07:47.551476 master-0 kubenswrapper[9368]: I1203 20:07:47.551424 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hb9ml"
Dec 03 20:07:47.554756 master-0 kubenswrapper[9368]: I1203 20:07:47.554626 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bs7zk"]
Dec 03 20:07:47.556090 master-0 kubenswrapper[9368]: I1203 20:07:47.556022 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bs7zk"
Dec 03 20:07:47.569596 master-0 kubenswrapper[9368]: I1203 20:07:47.569510 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bs7zk"]
Dec 03 20:07:47.581565 master-0 kubenswrapper[9368]: I1203 20:07:47.581502 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hb9ml"]
Dec 03 20:07:47.600808 master-0 kubenswrapper[9368]: I1203 20:07:47.600729 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9-catalog-content\") pod \"certified-operators-hb9ml\" (UID: \"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9\") " pod="openshift-marketplace/certified-operators-hb9ml"
Dec 03 20:07:47.600808 master-0 kubenswrapper[9368]: I1203 20:07:47.600811 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7vxl\" (UniqueName: \"kubernetes.io/projected/c52974d8-fbe6-444b-97ae-468482eebac8-kube-api-access-p7vxl\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:07:47.601094 master-0 kubenswrapper[9368]: I1203 20:07:47.600846 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt78x\" (UniqueName: \"kubernetes.io/projected/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9-kube-api-access-pt78x\") pod \"certified-operators-hb9ml\" (UID: \"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9\") " pod="openshift-marketplace/certified-operators-hb9ml"
Dec 03 20:07:47.601094 master-0 kubenswrapper[9368]: I1203 20:07:47.600902 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-client-ca\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:07:47.601094 master-0 kubenswrapper[9368]: I1203 20:07:47.600935 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d164eb04-3911-486e-a66b-828098a04e9f-utilities\") pod \"community-operators-bs7zk\" (UID: \"d164eb04-3911-486e-a66b-828098a04e9f\") " pod="openshift-marketplace/community-operators-bs7zk"
Dec 03 20:07:47.601094 master-0 kubenswrapper[9368]: I1203 20:07:47.600990 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d164eb04-3911-486e-a66b-828098a04e9f-catalog-content\") pod \"community-operators-bs7zk\" (UID: \"d164eb04-3911-486e-a66b-828098a04e9f\") " pod="openshift-marketplace/community-operators-bs7zk"
Dec 03 20:07:47.601287 master-0 kubenswrapper[9368]: I1203 20:07:47.601233 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9-utilities\") pod \"certified-operators-hb9ml\" (UID: \"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9\") " pod="openshift-marketplace/certified-operators-hb9ml"
Dec 03 20:07:47.601413 master-0 kubenswrapper[9368]: I1203 20:07:47.601378 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-config\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:07:47.601470 master-0 kubenswrapper[9368]: I1203 20:07:47.601427 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c52974d8-fbe6-444b-97ae-468482eebac8-serving-cert\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:07:47.601536 master-0 kubenswrapper[9368]: I1203 20:07:47.601502 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvwqx\" (UniqueName: \"kubernetes.io/projected/d164eb04-3911-486e-a66b-828098a04e9f-kube-api-access-tvwqx\") pod \"community-operators-bs7zk\" (UID: \"d164eb04-3911-486e-a66b-828098a04e9f\") " pod="openshift-marketplace/community-operators-bs7zk"
Dec 03 20:07:47.602670 master-0 kubenswrapper[9368]: I1203 20:07:47.602635 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-config\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:07:47.602761 master-0 kubenswrapper[9368]: I1203 20:07:47.602686 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-client-ca\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:07:47.605674 master-0 kubenswrapper[9368]: I1203 20:07:47.605630 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c52974d8-fbe6-444b-97ae-468482eebac8-serving-cert\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:07:47.620486 master-0 kubenswrapper[9368]: I1203 20:07:47.620448 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7vxl\" (UniqueName: \"kubernetes.io/projected/c52974d8-fbe6-444b-97ae-468482eebac8-kube-api-access-p7vxl\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:07:47.640425 master-0 kubenswrapper[9368]: I1203 20:07:47.640382 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:07:47.702769 master-0 kubenswrapper[9368]: I1203 20:07:47.702683 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d164eb04-3911-486e-a66b-828098a04e9f-utilities\") pod \"community-operators-bs7zk\" (UID: \"d164eb04-3911-486e-a66b-828098a04e9f\") " pod="openshift-marketplace/community-operators-bs7zk"
Dec 03 20:07:47.703013 master-0 kubenswrapper[9368]: I1203 20:07:47.702806 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d164eb04-3911-486e-a66b-828098a04e9f-catalog-content\") pod \"community-operators-bs7zk\" (UID: \"d164eb04-3911-486e-a66b-828098a04e9f\") " pod="openshift-marketplace/community-operators-bs7zk"
Dec 03 20:07:47.703013 master-0 kubenswrapper[9368]: I1203 20:07:47.702882 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9-utilities\") pod \"certified-operators-hb9ml\" (UID: \"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9\") " pod="openshift-marketplace/certified-operators-hb9ml"
Dec 03 20:07:47.703013 master-0 kubenswrapper[9368]: I1203 20:07:47.702954 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvwqx\" (UniqueName: \"kubernetes.io/projected/d164eb04-3911-486e-a66b-828098a04e9f-kube-api-access-tvwqx\") pod \"community-operators-bs7zk\" (UID: \"d164eb04-3911-486e-a66b-828098a04e9f\") " pod="openshift-marketplace/community-operators-bs7zk"
Dec 03 20:07:47.703013 master-0 kubenswrapper[9368]: I1203 20:07:47.702997 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9-catalog-content\") pod \"certified-operators-hb9ml\" (UID: \"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9\") " pod="openshift-marketplace/certified-operators-hb9ml"
Dec 03 20:07:47.703183 master-0 kubenswrapper[9368]: I1203 20:07:47.703037 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt78x\" (UniqueName: \"kubernetes.io/projected/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9-kube-api-access-pt78x\") pod \"certified-operators-hb9ml\" (UID: \"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9\") " pod="openshift-marketplace/certified-operators-hb9ml"
Dec 03 20:07:47.703543 master-0 kubenswrapper[9368]: I1203 20:07:47.703389 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9-utilities\") pod \"certified-operators-hb9ml\" (UID: \"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9\") " pod="openshift-marketplace/certified-operators-hb9ml"
Dec 03 20:07:47.703543 master-0 kubenswrapper[9368]: I1203 20:07:47.703464 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9-catalog-content\") pod \"certified-operators-hb9ml\" (UID: \"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9\") " pod="openshift-marketplace/certified-operators-hb9ml"
Dec 03 20:07:47.703543 master-0 kubenswrapper[9368]: I1203 20:07:47.703518 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d164eb04-3911-486e-a66b-828098a04e9f-catalog-content\") pod \"community-operators-bs7zk\" (UID: \"d164eb04-3911-486e-a66b-828098a04e9f\") " pod="openshift-marketplace/community-operators-bs7zk"
Dec 03 20:07:47.703688 master-0 kubenswrapper[9368]: I1203 20:07:47.703618 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d164eb04-3911-486e-a66b-828098a04e9f-utilities\") pod \"community-operators-bs7zk\" (UID: \"d164eb04-3911-486e-a66b-828098a04e9f\") " pod="openshift-marketplace/community-operators-bs7zk"
Dec 03 20:07:47.727405 master-0 kubenswrapper[9368]: I1203 20:07:47.727364 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvwqx\" (UniqueName: \"kubernetes.io/projected/d164eb04-3911-486e-a66b-828098a04e9f-kube-api-access-tvwqx\") pod \"community-operators-bs7zk\" (UID: \"d164eb04-3911-486e-a66b-828098a04e9f\") " pod="openshift-marketplace/community-operators-bs7zk"
Dec 03 20:07:47.727848 master-0 kubenswrapper[9368]: I1203 20:07:47.727760 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt78x\" (UniqueName: \"kubernetes.io/projected/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9-kube-api-access-pt78x\") pod \"certified-operators-hb9ml\" (UID: \"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9\") " pod="openshift-marketplace/certified-operators-hb9ml"
Dec 03 20:07:47.873713 master-0 kubenswrapper[9368]: I1203 20:07:47.873679 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hb9ml"
Dec 03 20:07:47.892039 master-0 kubenswrapper[9368]: I1203 20:07:47.891643 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bs7zk"
Dec 03 20:07:47.913561 master-0 kubenswrapper[9368]: I1203 20:07:47.913485 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"]
Dec 03 20:07:47.919283 master-0 kubenswrapper[9368]: W1203 20:07:47.919243 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc52974d8_fbe6_444b_97ae_468482eebac8.slice/crio-a57372b0142961fc3eb84ca639278793b8a44eddc61b15169b7d9172b7c9d91a WatchSource:0}: Error finding container a57372b0142961fc3eb84ca639278793b8a44eddc61b15169b7d9172b7c9d91a: Status 404 returned error can't find the container with id a57372b0142961fc3eb84ca639278793b8a44eddc61b15169b7d9172b7c9d91a
Dec 03 20:07:48.283547 master-0 kubenswrapper[9368]: I1203 20:07:48.283427 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hb9ml"]
Dec 03 20:07:48.296498 master-0 kubenswrapper[9368]: W1203 20:07:48.295689 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d5dddf7_f9a3_4169_89bd_dedee0aacdd9.slice/crio-2053a09db44895fb2c7bdb65e57180837936a611d76aaf8c24e1648e544c2463 WatchSource:0}: Error finding container 2053a09db44895fb2c7bdb65e57180837936a611d76aaf8c24e1648e544c2463: Status 404 returned error can't find the container with id 2053a09db44895fb2c7bdb65e57180837936a611d76aaf8c24e1648e544c2463
Dec 03 20:07:48.337378 master-0 kubenswrapper[9368]: I1203 20:07:48.337337 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bs7zk"]
Dec 03 20:07:48.346546 master-0 kubenswrapper[9368]: W1203 20:07:48.346498 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd164eb04_3911_486e_a66b_828098a04e9f.slice/crio-429132f48eae082ad30a12cc017ed2e511b1e9dd5df5a1f63704adb2b682a31b WatchSource:0}: Error finding container 429132f48eae082ad30a12cc017ed2e511b1e9dd5df5a1f63704adb2b682a31b: Status 404 returned error can't find the container with id 429132f48eae082ad30a12cc017ed2e511b1e9dd5df5a1f63704adb2b682a31b
Dec 03 20:07:48.402014 master-0 kubenswrapper[9368]: I1203 20:07:48.401965 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj" event={"ID":"c52974d8-fbe6-444b-97ae-468482eebac8","Type":"ContainerStarted","Data":"fb26888be03097a323ec8a570f669f63df9f46f37a911bd2cc4812c68e4c8b64"}
Dec 03 20:07:48.402633 master-0 kubenswrapper[9368]: I1203 20:07:48.402019 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj" event={"ID":"c52974d8-fbe6-444b-97ae-468482eebac8","Type":"ContainerStarted","Data":"a57372b0142961fc3eb84ca639278793b8a44eddc61b15169b7d9172b7c9d91a"}
Dec 03 20:07:48.403290 master-0 kubenswrapper[9368]: I1203 20:07:48.403028 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hb9ml" event={"ID":"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9","Type":"ContainerStarted","Data":"2053a09db44895fb2c7bdb65e57180837936a611d76aaf8c24e1648e544c2463"}
Dec 03 20:07:48.405199 master-0 kubenswrapper[9368]: I1203 20:07:48.405168 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bs7zk" event={"ID":"d164eb04-3911-486e-a66b-828098a04e9f","Type":"ContainerStarted","Data":"429132f48eae082ad30a12cc017ed2e511b1e9dd5df5a1f63704adb2b682a31b"}
Dec 03 20:07:48.426912 master-0 kubenswrapper[9368]: I1203 20:07:48.426548 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj" podStartSLOduration=672.426523969 podStartE2EDuration="11m12.426523969s" podCreationTimestamp="2025-12-03 19:56:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:07:48.423912779 +0000 UTC m=+734.085162730" watchObservedRunningTime="2025-12-03 20:07:48.426523969 +0000 UTC m=+734.087773900"
Dec 03 20:07:49.415479 master-0 kubenswrapper[9368]: I1203 20:07:49.415389 9368 generic.go:334] "Generic (PLEG): container finished" podID="d164eb04-3911-486e-a66b-828098a04e9f" containerID="c6059341c595203e9601330dcbb748e3aa38e2ceee758569d4d6a2c774d57639" exitCode=0
Dec 03 20:07:49.416772 master-0 kubenswrapper[9368]: I1203 20:07:49.415518 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bs7zk" event={"ID":"d164eb04-3911-486e-a66b-828098a04e9f","Type":"ContainerDied","Data":"c6059341c595203e9601330dcbb748e3aa38e2ceee758569d4d6a2c774d57639"}
Dec 03 20:07:49.417640 master-0 kubenswrapper[9368]: I1203 20:07:49.417550 9368 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 03 20:07:49.418303 master-0 kubenswrapper[9368]: I1203 20:07:49.418236 9368 generic.go:334] "Generic (PLEG): container finished" podID="4d5dddf7-f9a3-4169-89bd-dedee0aacdd9" containerID="9acc151a0309db37cab47dfef4aff287d328a4b21d2f88e157cf49cb2c1eb135" exitCode=0
Dec 03 20:07:49.418663 master-0 kubenswrapper[9368]: I1203 20:07:49.418613 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hb9ml" event={"ID":"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9","Type":"ContainerDied","Data":"9acc151a0309db37cab47dfef4aff287d328a4b21d2f88e157cf49cb2c1eb135"}
Dec 03 20:07:49.419269 master-0 kubenswrapper[9368]: I1203 20:07:49.419229 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:07:49.437231 master-0 kubenswrapper[9368]: I1203 20:07:49.436862 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:07:49.543927 master-0 kubenswrapper[9368]: I1203 20:07:49.543840 9368 scope.go:117] "RemoveContainer" containerID="64faeeb7a4647a9e5dd702400fe60f14013f02b00360bb310c4d37859f33d70c"
Dec 03 20:07:49.544307 master-0 kubenswrapper[9368]: I1203 20:07:49.544166 9368 scope.go:117] "RemoveContainer" containerID="728aa51e420a0e8c358ef69d6ddcb175d50c7be37aab4f4fdfde93a0791a7b8e"
Dec 03 20:07:49.544307 master-0 kubenswrapper[9368]: E1203 20:07:49.544201 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-b5dddf8f5-79ccj_openshift-kube-controller-manager-operator(e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" podUID="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3"
Dec 03 20:07:49.544307 master-0 kubenswrapper[9368]: E1203 20:07:49.544573 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=network-operator pod=network-operator-6cbf58c977-w7d8t_openshift-network-operator(6eb4700c-6af0-468b-afc8-1e09b902d6bf)\"" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" podUID="6eb4700c-6af0-468b-afc8-1e09b902d6bf"
Dec 03 20:07:49.545569 master-0 kubenswrapper[9368]: I1203 20:07:49.545485 9368 scope.go:117] "RemoveContainer" containerID="db25cf44f0675c418850d8d41463efcb1765ff94722958664210b9165ac00ff3"
Dec 03 20:07:49.546164 master-0 kubenswrapper[9368]: E1203 20:07:49.546070 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-apiserver-operator pod=openshift-apiserver-operator-667484ff5-lsltt_openshift-apiserver-operator(d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f)\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" podUID="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f"
Dec 03 20:07:49.716391 master-0 kubenswrapper[9368]: I1203 20:07:49.716181 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vnt24"]
Dec 03 20:07:49.718012 master-0 kubenswrapper[9368]: I1203 20:07:49.717938 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vnt24"
Dec 03 20:07:49.732939 master-0 kubenswrapper[9368]: I1203 20:07:49.732469 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vnt24"]
Dec 03 20:07:49.829408 master-0 kubenswrapper[9368]: I1203 20:07:49.829311 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b507554-0ccf-474e-b674-3546d174419d-utilities\") pod \"redhat-operators-vnt24\" (UID: \"4b507554-0ccf-474e-b674-3546d174419d\") " pod="openshift-marketplace/redhat-operators-vnt24"
Dec 03 20:07:49.829674 master-0 kubenswrapper[9368]: I1203 20:07:49.829422 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25kbk\" (UniqueName: \"kubernetes.io/projected/4b507554-0ccf-474e-b674-3546d174419d-kube-api-access-25kbk\") pod \"redhat-operators-vnt24\" (UID: \"4b507554-0ccf-474e-b674-3546d174419d\") " pod="openshift-marketplace/redhat-operators-vnt24"
Dec 03 20:07:49.829674 master-0 kubenswrapper[9368]: I1203 20:07:49.829505 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b507554-0ccf-474e-b674-3546d174419d-catalog-content\") pod \"redhat-operators-vnt24\" (UID: \"4b507554-0ccf-474e-b674-3546d174419d\") " pod="openshift-marketplace/redhat-operators-vnt24"
Dec 03 20:07:49.931652 master-0 kubenswrapper[9368]: I1203 20:07:49.931356 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b507554-0ccf-474e-b674-3546d174419d-utilities\") pod \"redhat-operators-vnt24\" (UID: \"4b507554-0ccf-474e-b674-3546d174419d\") " pod="openshift-marketplace/redhat-operators-vnt24"
Dec 03 20:07:49.931652 master-0 kubenswrapper[9368]: I1203 20:07:49.931611 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25kbk\" (UniqueName: \"kubernetes.io/projected/4b507554-0ccf-474e-b674-3546d174419d-kube-api-access-25kbk\") pod \"redhat-operators-vnt24\" (UID: \"4b507554-0ccf-474e-b674-3546d174419d\") " pod="openshift-marketplace/redhat-operators-vnt24"
Dec 03 20:07:49.932128 master-0 kubenswrapper[9368]: I1203 20:07:49.931840 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b507554-0ccf-474e-b674-3546d174419d-catalog-content\") pod \"redhat-operators-vnt24\" (UID: \"4b507554-0ccf-474e-b674-3546d174419d\") " pod="openshift-marketplace/redhat-operators-vnt24"
Dec 03 20:07:49.932201 master-0 kubenswrapper[9368]: I1203 20:07:49.932171 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b507554-0ccf-474e-b674-3546d174419d-utilities\") pod \"redhat-operators-vnt24\" (UID: \"4b507554-0ccf-474e-b674-3546d174419d\") " pod="openshift-marketplace/redhat-operators-vnt24"
Dec 03 20:07:49.932630 master-0 kubenswrapper[9368]: I1203 20:07:49.932557 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b507554-0ccf-474e-b674-3546d174419d-catalog-content\") pod \"redhat-operators-vnt24\" (UID: \"4b507554-0ccf-474e-b674-3546d174419d\") " pod="openshift-marketplace/redhat-operators-vnt24"
Dec 03 20:07:49.932750 master-0 kubenswrapper[9368]: I1203 20:07:49.932652 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qnhnn"]
Dec 03 20:07:49.935432 master-0 kubenswrapper[9368]: I1203 20:07:49.935122 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnhnn"
Dec 03 20:07:49.946434 master-0 kubenswrapper[9368]: I1203 20:07:49.946319 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnhnn"]
Dec 03 20:07:49.957442 master-0 kubenswrapper[9368]: I1203 20:07:49.957344 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25kbk\" (UniqueName: \"kubernetes.io/projected/4b507554-0ccf-474e-b674-3546d174419d-kube-api-access-25kbk\") pod \"redhat-operators-vnt24\" (UID: \"4b507554-0ccf-474e-b674-3546d174419d\") " pod="openshift-marketplace/redhat-operators-vnt24"
Dec 03 20:07:50.033259 master-0 kubenswrapper[9368]: I1203 20:07:50.033105 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/190cec37-48ec-49ed-b954-0874a0adaa6b-catalog-content\") pod \"redhat-marketplace-qnhnn\" (UID: \"190cec37-48ec-49ed-b954-0874a0adaa6b\") " pod="openshift-marketplace/redhat-marketplace-qnhnn"
Dec 03 20:07:50.033495 master-0 kubenswrapper[9368]: I1203 20:07:50.033318 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5qv2\" (UniqueName: \"kubernetes.io/projected/190cec37-48ec-49ed-b954-0874a0adaa6b-kube-api-access-m5qv2\") pod \"redhat-marketplace-qnhnn\" (UID: \"190cec37-48ec-49ed-b954-0874a0adaa6b\") " pod="openshift-marketplace/redhat-marketplace-qnhnn"
Dec 03 20:07:50.033495 master-0 kubenswrapper[9368]: I1203 20:07:50.033442 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/190cec37-48ec-49ed-b954-0874a0adaa6b-utilities\") pod \"redhat-marketplace-qnhnn\" (UID: \"190cec37-48ec-49ed-b954-0874a0adaa6b\") " pod="openshift-marketplace/redhat-marketplace-qnhnn"
Dec 03 20:07:50.059736 master-0 kubenswrapper[9368]: I1203 20:07:50.059686 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vnt24"
Dec 03 20:07:50.134333 master-0 kubenswrapper[9368]: I1203 20:07:50.134240 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5qv2\" (UniqueName: \"kubernetes.io/projected/190cec37-48ec-49ed-b954-0874a0adaa6b-kube-api-access-m5qv2\") pod \"redhat-marketplace-qnhnn\" (UID: \"190cec37-48ec-49ed-b954-0874a0adaa6b\") " pod="openshift-marketplace/redhat-marketplace-qnhnn"
Dec 03 20:07:50.134333 master-0 kubenswrapper[9368]: I1203 20:07:50.134322 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/190cec37-48ec-49ed-b954-0874a0adaa6b-utilities\") pod \"redhat-marketplace-qnhnn\" (UID: \"190cec37-48ec-49ed-b954-0874a0adaa6b\") " pod="openshift-marketplace/redhat-marketplace-qnhnn"
Dec 03 20:07:50.134508 master-0 kubenswrapper[9368]: I1203 20:07:50.134363 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/190cec37-48ec-49ed-b954-0874a0adaa6b-catalog-content\") pod \"redhat-marketplace-qnhnn\" (UID: \"190cec37-48ec-49ed-b954-0874a0adaa6b\") " pod="openshift-marketplace/redhat-marketplace-qnhnn"
Dec 03 20:07:50.134980 master-0 kubenswrapper[9368]: I1203 20:07:50.134938 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/190cec37-48ec-49ed-b954-0874a0adaa6b-catalog-content\") pod \"redhat-marketplace-qnhnn\" (UID: \"190cec37-48ec-49ed-b954-0874a0adaa6b\") " pod="openshift-marketplace/redhat-marketplace-qnhnn"
Dec 03 20:07:50.135159 master-0 kubenswrapper[9368]: I1203 20:07:50.135110 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/190cec37-48ec-49ed-b954-0874a0adaa6b-utilities\") pod \"redhat-marketplace-qnhnn\" (UID: \"190cec37-48ec-49ed-b954-0874a0adaa6b\") " pod="openshift-marketplace/redhat-marketplace-qnhnn"
Dec 03 20:07:50.159073 master-0 kubenswrapper[9368]: I1203 20:07:50.157521 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5qv2\" (UniqueName: \"kubernetes.io/projected/190cec37-48ec-49ed-b954-0874a0adaa6b-kube-api-access-m5qv2\") pod \"redhat-marketplace-qnhnn\" (UID: \"190cec37-48ec-49ed-b954-0874a0adaa6b\") " pod="openshift-marketplace/redhat-marketplace-qnhnn"
Dec 03 20:07:50.296034 master-0 kubenswrapper[9368]: I1203 20:07:50.294126 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnhnn"
Dec 03 20:07:50.431339 master-0 kubenswrapper[9368]: I1203 20:07:50.428899 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hb9ml" event={"ID":"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9","Type":"ContainerStarted","Data":"51f872968f7dfc0454c4bd867d2abc6f217f582b4721299fe79cd5b6089b9274"}
Dec 03 20:07:50.436316 master-0 kubenswrapper[9368]: I1203 20:07:50.434204 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bs7zk" event={"ID":"d164eb04-3911-486e-a66b-828098a04e9f","Type":"ContainerStarted","Data":"ed5cf6bbbd1d743d847de0bbebfc49b46a16930cec22cd1dccc435991c4b1bc0"}
Dec 03 20:07:50.530379 master-0 kubenswrapper[9368]: I1203 20:07:50.530311 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vnt24"]
Dec 03 20:07:50.545539 master-0 kubenswrapper[9368]: I1203 20:07:50.544461 9368 scope.go:117] "RemoveContainer" containerID="c2730eaef31938f9b283223c81622c1d4bbc549630ded57fc1762a2568d60b23"
Dec 03 20:07:50.545539 master-0 kubenswrapper[9368]: E1203 20:07:50.544657 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=service-ca-operator pod=service-ca-operator-56f5898f45-v6rp5_openshift-service-ca-operator(01d51d9a-9beb-4357-9dc2-aeac210cd0c4)\"" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" podUID="01d51d9a-9beb-4357-9dc2-aeac210cd0c4"
Dec 03 20:07:50.658177 master-0 kubenswrapper[9368]: W1203 20:07:50.658099 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b507554_0ccf_474e_b674_3546d174419d.slice/crio-666ad51f38944e2835a47dc2eef12cdc4e1a1f7de33cf39c376607a4fac3d13f WatchSource:0}: Error finding container 666ad51f38944e2835a47dc2eef12cdc4e1a1f7de33cf39c376607a4fac3d13f: Status 404 returned error can't find the container with id 666ad51f38944e2835a47dc2eef12cdc4e1a1f7de33cf39c376607a4fac3d13f
Dec 03 20:07:50.737456 master-0 kubenswrapper[9368]: I1203 20:07:50.737405 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnhnn"]
Dec 03 20:07:50.752795 master-0 kubenswrapper[9368]: W1203 20:07:50.752736 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod190cec37_48ec_49ed_b954_0874a0adaa6b.slice/crio-69e77e77670a597953c26f68f2df83dffd82b040457787598e8a59bb203a8f46 WatchSource:0}: Error finding container 69e77e77670a597953c26f68f2df83dffd82b040457787598e8a59bb203a8f46: Status 404 returned error can't find the container with id 69e77e77670a597953c26f68f2df83dffd82b040457787598e8a59bb203a8f46
Dec 03 20:07:50.981684 master-0 kubenswrapper[9368]: I1203 20:07:50.981619 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf"]
Dec 03 20:07:50.985830 master-0 kubenswrapper[9368]: I1203 20:07:50.982715 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf"
Dec 03 20:07:50.989826 master-0 kubenswrapper[9368]: I1203 20:07:50.989091 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 03 20:07:50.989826 master-0 kubenswrapper[9368]: I1203 20:07:50.989340 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 03 20:07:50.993819 master-0 kubenswrapper[9368]: I1203 20:07:50.990112 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 03 20:07:50.993819 master-0 kubenswrapper[9368]: I1203 20:07:50.990274 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 03 20:07:50.993819 master-0 kubenswrapper[9368]: I1203 20:07:50.990416 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 03 20:07:50.993819 master-0 kubenswrapper[9368]: I1203 20:07:50.990989 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75"]
Dec 03 20:07:50.993819 master-0 kubenswrapper[9368]: I1203 20:07:50.992220 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75"
Dec 03 20:07:51.001818 master-0 kubenswrapper[9368]: I1203 20:07:50.998436 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Dec 03 20:07:51.001818 master-0 kubenswrapper[9368]: I1203 20:07:50.998649 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Dec 03 20:07:51.001818 master-0 kubenswrapper[9368]: I1203 20:07:50.998868 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Dec 03 20:07:51.001818 master-0 kubenswrapper[9368]: I1203 20:07:50.998998 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Dec 03 20:07:51.001818 master-0 kubenswrapper[9368]: I1203 20:07:50.999129 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Dec 03 20:07:51.150254 master-0 kubenswrapper[9368]: I1203 20:07:51.150200 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wcmd\" (UniqueName: \"kubernetes.io/projected/90610a53-b590-491e-8014-f0704afdc6e1-kube-api-access-4wcmd\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75"
Dec 03 20:07:51.150424 master-0 kubenswrapper[9368]: I1203 20:07:51.150261 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/90610a53-b590-491e-8014-f0704afdc6e1-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75"
Dec 03 20:07:51.150424 master-0 kubenswrapper[9368]: I1203 20:07:51.150288 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/90610a53-b590-491e-8014-f0704afdc6e1-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75"
Dec 03 20:07:51.150424 master-0 kubenswrapper[9368]: I1203 20:07:51.150334 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/09f5df5c-fd9b-430d-aecc-242594b4aff1-machine-approver-tls\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf"
Dec 03 20:07:51.150561 master-0 kubenswrapper[9368]: I1203 20:07:51.150511 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f5df5c-fd9b-430d-aecc-242594b4aff1-config\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf"
Dec 03 20:07:51.150642 master-0 kubenswrapper[9368]: I1203 20:07:51.150612 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName:
\"kubernetes.io/configmap/90610a53-b590-491e-8014-f0704afdc6e1-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:07:51.150971 master-0 kubenswrapper[9368]: I1203 20:07:51.150669 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twlw5\" (UniqueName: \"kubernetes.io/projected/09f5df5c-fd9b-430d-aecc-242594b4aff1-kube-api-access-twlw5\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:07:51.150971 master-0 kubenswrapper[9368]: I1203 20:07:51.150702 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/90610a53-b590-491e-8014-f0704afdc6e1-images\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:07:51.150971 master-0 kubenswrapper[9368]: I1203 20:07:51.150856 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/09f5df5c-fd9b-430d-aecc-242594b4aff1-auth-proxy-config\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:07:51.252834 master-0 kubenswrapper[9368]: I1203 20:07:51.251766 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f5df5c-fd9b-430d-aecc-242594b4aff1-config\") pod 
\"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:07:51.252834 master-0 kubenswrapper[9368]: I1203 20:07:51.251890 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/90610a53-b590-491e-8014-f0704afdc6e1-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:07:51.252834 master-0 kubenswrapper[9368]: I1203 20:07:51.251939 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twlw5\" (UniqueName: \"kubernetes.io/projected/09f5df5c-fd9b-430d-aecc-242594b4aff1-kube-api-access-twlw5\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:07:51.252834 master-0 kubenswrapper[9368]: I1203 20:07:51.251975 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/90610a53-b590-491e-8014-f0704afdc6e1-images\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:07:51.252834 master-0 kubenswrapper[9368]: I1203 20:07:51.252045 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/09f5df5c-fd9b-430d-aecc-242594b4aff1-auth-proxy-config\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " 
pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:07:51.252834 master-0 kubenswrapper[9368]: I1203 20:07:51.252102 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wcmd\" (UniqueName: \"kubernetes.io/projected/90610a53-b590-491e-8014-f0704afdc6e1-kube-api-access-4wcmd\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:07:51.252834 master-0 kubenswrapper[9368]: I1203 20:07:51.252158 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/90610a53-b590-491e-8014-f0704afdc6e1-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:07:51.252834 master-0 kubenswrapper[9368]: I1203 20:07:51.252204 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/90610a53-b590-491e-8014-f0704afdc6e1-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:07:51.252834 master-0 kubenswrapper[9368]: I1203 20:07:51.252251 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/09f5df5c-fd9b-430d-aecc-242594b4aff1-machine-approver-tls\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " 
pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:07:51.252834 master-0 kubenswrapper[9368]: I1203 20:07:51.252264 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/90610a53-b590-491e-8014-f0704afdc6e1-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:07:51.252834 master-0 kubenswrapper[9368]: I1203 20:07:51.252740 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f5df5c-fd9b-430d-aecc-242594b4aff1-config\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:07:51.253579 master-0 kubenswrapper[9368]: I1203 20:07:51.253012 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/09f5df5c-fd9b-430d-aecc-242594b4aff1-auth-proxy-config\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:07:51.253579 master-0 kubenswrapper[9368]: I1203 20:07:51.253015 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/90610a53-b590-491e-8014-f0704afdc6e1-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:07:51.253579 master-0 kubenswrapper[9368]: I1203 20:07:51.253103 9368 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/90610a53-b590-491e-8014-f0704afdc6e1-images\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:07:51.257177 master-0 kubenswrapper[9368]: I1203 20:07:51.257100 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/90610a53-b590-491e-8014-f0704afdc6e1-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:07:51.258512 master-0 kubenswrapper[9368]: I1203 20:07:51.258441 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/09f5df5c-fd9b-430d-aecc-242594b4aff1-machine-approver-tls\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:07:51.269242 master-0 kubenswrapper[9368]: I1203 20:07:51.269174 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wcmd\" (UniqueName: \"kubernetes.io/projected/90610a53-b590-491e-8014-f0704afdc6e1-kube-api-access-4wcmd\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:07:51.281383 master-0 kubenswrapper[9368]: I1203 20:07:51.281306 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-twlw5\" (UniqueName: \"kubernetes.io/projected/09f5df5c-fd9b-430d-aecc-242594b4aff1-kube-api-access-twlw5\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:07:51.359510 master-0 kubenswrapper[9368]: I1203 20:07:51.359437 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:07:51.373091 master-0 kubenswrapper[9368]: I1203 20:07:51.373024 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:07:51.378007 master-0 kubenswrapper[9368]: W1203 20:07:51.377929 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09f5df5c_fd9b_430d_aecc_242594b4aff1.slice/crio-94345aca0ffddac869becc835a6c22d571aca8cdc67c8d1a0844b640b65b6099 WatchSource:0}: Error finding container 94345aca0ffddac869becc835a6c22d571aca8cdc67c8d1a0844b640b65b6099: Status 404 returned error can't find the container with id 94345aca0ffddac869becc835a6c22d571aca8cdc67c8d1a0844b640b65b6099 Dec 03 20:07:51.406296 master-0 kubenswrapper[9368]: W1203 20:07:51.406234 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90610a53_b590_491e_8014_f0704afdc6e1.slice/crio-d1fa423033e36d1ea0fc496cea36bc32452af1d97e8f92ea7f243855f38360a2 WatchSource:0}: Error finding container d1fa423033e36d1ea0fc496cea36bc32452af1d97e8f92ea7f243855f38360a2: Status 404 returned error can't find the container with id d1fa423033e36d1ea0fc496cea36bc32452af1d97e8f92ea7f243855f38360a2 Dec 03 20:07:51.447582 master-0 kubenswrapper[9368]: I1203 20:07:51.447528 9368 generic.go:334] "Generic (PLEG): 
container finished" podID="4d5dddf7-f9a3-4169-89bd-dedee0aacdd9" containerID="51f872968f7dfc0454c4bd867d2abc6f217f582b4721299fe79cd5b6089b9274" exitCode=0 Dec 03 20:07:51.448100 master-0 kubenswrapper[9368]: I1203 20:07:51.447611 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hb9ml" event={"ID":"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9","Type":"ContainerDied","Data":"51f872968f7dfc0454c4bd867d2abc6f217f582b4721299fe79cd5b6089b9274"} Dec 03 20:07:51.450466 master-0 kubenswrapper[9368]: I1203 20:07:51.450275 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bs7zk" event={"ID":"d164eb04-3911-486e-a66b-828098a04e9f","Type":"ContainerDied","Data":"ed5cf6bbbd1d743d847de0bbebfc49b46a16930cec22cd1dccc435991c4b1bc0"} Dec 03 20:07:51.450591 master-0 kubenswrapper[9368]: I1203 20:07:51.450567 9368 generic.go:334] "Generic (PLEG): container finished" podID="d164eb04-3911-486e-a66b-828098a04e9f" containerID="ed5cf6bbbd1d743d847de0bbebfc49b46a16930cec22cd1dccc435991c4b1bc0" exitCode=0 Dec 03 20:07:51.451918 master-0 kubenswrapper[9368]: I1203 20:07:51.451882 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" event={"ID":"90610a53-b590-491e-8014-f0704afdc6e1","Type":"ContainerStarted","Data":"d1fa423033e36d1ea0fc496cea36bc32452af1d97e8f92ea7f243855f38360a2"} Dec 03 20:07:51.455070 master-0 kubenswrapper[9368]: I1203 20:07:51.454927 9368 generic.go:334] "Generic (PLEG): container finished" podID="4b507554-0ccf-474e-b674-3546d174419d" containerID="bb593d1209c849fffed27a1d8bb320795bc79053be60df59d7acbd932edfd4d0" exitCode=0 Dec 03 20:07:51.455070 master-0 kubenswrapper[9368]: I1203 20:07:51.454974 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnt24" 
event={"ID":"4b507554-0ccf-474e-b674-3546d174419d","Type":"ContainerDied","Data":"bb593d1209c849fffed27a1d8bb320795bc79053be60df59d7acbd932edfd4d0"} Dec 03 20:07:51.455070 master-0 kubenswrapper[9368]: I1203 20:07:51.454992 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnt24" event={"ID":"4b507554-0ccf-474e-b674-3546d174419d","Type":"ContainerStarted","Data":"666ad51f38944e2835a47dc2eef12cdc4e1a1f7de33cf39c376607a4fac3d13f"} Dec 03 20:07:51.457678 master-0 kubenswrapper[9368]: I1203 20:07:51.457591 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" event={"ID":"09f5df5c-fd9b-430d-aecc-242594b4aff1","Type":"ContainerStarted","Data":"94345aca0ffddac869becc835a6c22d571aca8cdc67c8d1a0844b640b65b6099"} Dec 03 20:07:51.464669 master-0 kubenswrapper[9368]: I1203 20:07:51.464628 9368 generic.go:334] "Generic (PLEG): container finished" podID="190cec37-48ec-49ed-b954-0874a0adaa6b" containerID="7d1e43b364132c9e6375ff146b7933b43cca5922fef062c3e7e4a746d696b9bb" exitCode=0 Dec 03 20:07:51.464902 master-0 kubenswrapper[9368]: I1203 20:07:51.464813 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnhnn" event={"ID":"190cec37-48ec-49ed-b954-0874a0adaa6b","Type":"ContainerDied","Data":"7d1e43b364132c9e6375ff146b7933b43cca5922fef062c3e7e4a746d696b9bb"} Dec 03 20:07:51.464962 master-0 kubenswrapper[9368]: I1203 20:07:51.464913 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnhnn" event={"ID":"190cec37-48ec-49ed-b954-0874a0adaa6b","Type":"ContainerStarted","Data":"69e77e77670a597953c26f68f2df83dffd82b040457787598e8a59bb203a8f46"} Dec 03 20:07:51.545259 master-0 kubenswrapper[9368]: I1203 20:07:51.544999 9368 scope.go:117] "RemoveContainer" containerID="89ed390af07eecb0f2a6fd24fe986b57e8e8f83dbf2ff2202963967a2fcc7b5e" Dec 03 20:07:51.545259 master-0 
kubenswrapper[9368]: E1203 20:07:51.545220 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-67c4cff67d-p7xj5_openshift-kube-storage-version-migrator-operator(11e2c94f-f9e9-415b-a550-3006a4632ba4)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" podUID="11e2c94f-f9e9-415b-a550-3006a4632ba4" Dec 03 20:07:51.545669 master-0 kubenswrapper[9368]: I1203 20:07:51.545540 9368 scope.go:117] "RemoveContainer" containerID="abf1acea0f13046f42e18d29f9f01a5591776e77d3e8cc4b525da74b968fc06b" Dec 03 20:07:51.545825 master-0 kubenswrapper[9368]: E1203 20:07:51.545790 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-operator pod=kube-apiserver-operator-5b557b5f57-9t9fn_openshift-kube-apiserver-operator(943feb0d-7d31-446a-9100-dfc4ef013d12)\"" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" podUID="943feb0d-7d31-446a-9100-dfc4ef013d12" Dec 03 20:07:52.475277 master-0 kubenswrapper[9368]: I1203 20:07:52.475114 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnt24" event={"ID":"4b507554-0ccf-474e-b674-3546d174419d","Type":"ContainerStarted","Data":"36666af06ebcdb898153a85528e3078d2307f8942e516a3be9b07eca917081ee"} Dec 03 20:07:52.478804 master-0 kubenswrapper[9368]: I1203 20:07:52.478752 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hb9ml" event={"ID":"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9","Type":"ContainerStarted","Data":"50f2ca5f355904fd35520fc293fbaef74acbdec1dd8237e3cec6e9a68b8c2389"} Dec 03 
20:07:52.483199 master-0 kubenswrapper[9368]: I1203 20:07:52.483105 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bs7zk" event={"ID":"d164eb04-3911-486e-a66b-828098a04e9f","Type":"ContainerStarted","Data":"ea336a5c6da87525e71ebeeed9d24688824102b55e1392263985ce7edfe9cc69"} Dec 03 20:07:52.485918 master-0 kubenswrapper[9368]: I1203 20:07:52.485872 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" event={"ID":"90610a53-b590-491e-8014-f0704afdc6e1","Type":"ContainerStarted","Data":"b77eb867ebf0b9ba1fc709801804fd9cc85f08bba70173293a137b2ac6354db6"} Dec 03 20:07:52.485985 master-0 kubenswrapper[9368]: I1203 20:07:52.485928 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" event={"ID":"90610a53-b590-491e-8014-f0704afdc6e1","Type":"ContainerStarted","Data":"9946cce66dc556dd131344d03650b8faa9a0b2b63bed8a11cd1e6f07c23a90dd"} Dec 03 20:07:52.485985 master-0 kubenswrapper[9368]: I1203 20:07:52.485950 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" event={"ID":"90610a53-b590-491e-8014-f0704afdc6e1","Type":"ContainerStarted","Data":"67ff3b2d6e133fd0ed91d67c9da0927b74c01228c3f55d83514f9585b8b0a727"} Dec 03 20:07:52.488082 master-0 kubenswrapper[9368]: I1203 20:07:52.488048 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" event={"ID":"09f5df5c-fd9b-430d-aecc-242594b4aff1","Type":"ContainerStarted","Data":"7a84a74371b36244b4a068cdaa96542ed06737646b31eaba5baa599a80646eae"} Dec 03 20:07:52.488082 master-0 kubenswrapper[9368]: I1203 20:07:52.488079 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" event={"ID":"09f5df5c-fd9b-430d-aecc-242594b4aff1","Type":"ContainerStarted","Data":"83e12708a0bc9962a4edaac4a6054c57e691fbd74f8eab8478636975adb6434a"} Dec 03 20:07:52.754652 master-0 kubenswrapper[9368]: I1203 20:07:52.754493 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" podStartSLOduration=2.754472801 podStartE2EDuration="2.754472801s" podCreationTimestamp="2025-12-03 20:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:07:52.75399364 +0000 UTC m=+738.415243561" watchObservedRunningTime="2025-12-03 20:07:52.754472801 +0000 UTC m=+738.415722712" Dec 03 20:07:52.789525 master-0 kubenswrapper[9368]: I1203 20:07:52.789433 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hb9ml" podStartSLOduration=3.3455354919999998 podStartE2EDuration="5.789410593s" podCreationTimestamp="2025-12-03 20:07:47 +0000 UTC" firstStartedPulling="2025-12-03 20:07:49.420679726 +0000 UTC m=+735.081929677" lastFinishedPulling="2025-12-03 20:07:51.864554867 +0000 UTC m=+737.525804778" observedRunningTime="2025-12-03 20:07:52.785439831 +0000 UTC m=+738.446689742" watchObservedRunningTime="2025-12-03 20:07:52.789410593 +0000 UTC m=+738.450660504" Dec 03 20:07:52.809470 master-0 kubenswrapper[9368]: I1203 20:07:52.809390 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" podStartSLOduration=2.809369142 podStartE2EDuration="2.809369142s" podCreationTimestamp="2025-12-03 20:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
20:07:52.805977643 +0000 UTC m=+738.467227564" watchObservedRunningTime="2025-12-03 20:07:52.809369142 +0000 UTC m=+738.470619063" Dec 03 20:07:52.834756 master-0 kubenswrapper[9368]: I1203 20:07:52.834685 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bs7zk" podStartSLOduration=3.381774865 podStartE2EDuration="5.834666042s" podCreationTimestamp="2025-12-03 20:07:47 +0000 UTC" firstStartedPulling="2025-12-03 20:07:49.417484853 +0000 UTC m=+735.078734804" lastFinishedPulling="2025-12-03 20:07:51.87037607 +0000 UTC m=+737.531625981" observedRunningTime="2025-12-03 20:07:52.831633882 +0000 UTC m=+738.492883803" watchObservedRunningTime="2025-12-03 20:07:52.834666042 +0000 UTC m=+738.495915953" Dec 03 20:07:53.494751 master-0 kubenswrapper[9368]: I1203 20:07:53.494701 9368 generic.go:334] "Generic (PLEG): container finished" podID="4b507554-0ccf-474e-b674-3546d174419d" containerID="36666af06ebcdb898153a85528e3078d2307f8942e516a3be9b07eca917081ee" exitCode=0 Dec 03 20:07:53.495529 master-0 kubenswrapper[9368]: I1203 20:07:53.494803 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnt24" event={"ID":"4b507554-0ccf-474e-b674-3546d174419d","Type":"ContainerDied","Data":"36666af06ebcdb898153a85528e3078d2307f8942e516a3be9b07eca917081ee"} Dec 03 20:07:53.496795 master-0 kubenswrapper[9368]: I1203 20:07:53.496745 9368 generic.go:334] "Generic (PLEG): container finished" podID="190cec37-48ec-49ed-b954-0874a0adaa6b" containerID="e9c08afd3d28191fbb6f509fcc0f2344c9eccd21542f10fe794a334e7ec795da" exitCode=0 Dec 03 20:07:53.498871 master-0 kubenswrapper[9368]: I1203 20:07:53.496912 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnhnn" event={"ID":"190cec37-48ec-49ed-b954-0874a0adaa6b","Type":"ContainerDied","Data":"e9c08afd3d28191fbb6f509fcc0f2344c9eccd21542f10fe794a334e7ec795da"} Dec 03 20:07:53.544281 
master-0 kubenswrapper[9368]: I1203 20:07:53.544234 9368 scope.go:117] "RemoveContainer" containerID="46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967" Dec 03 20:07:53.544497 master-0 kubenswrapper[9368]: E1203 20:07:53.544463 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=bootstrap-kube-controller-manager-master-0_kube-system(7bce50c457ac1f4721bc81a570dd238a)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" Dec 03 20:07:54.507139 master-0 kubenswrapper[9368]: I1203 20:07:54.507076 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnhnn" event={"ID":"190cec37-48ec-49ed-b954-0874a0adaa6b","Type":"ContainerStarted","Data":"527fa36c882adc00ec5671ba7a7a7e111d413604162f96e6c1645a6e9c47f1c1"} Dec 03 20:07:54.510832 master-0 kubenswrapper[9368]: I1203 20:07:54.510789 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnt24" event={"ID":"4b507554-0ccf-474e-b674-3546d174419d","Type":"ContainerStarted","Data":"bbed2ddc8b6c6909a13c6b5942eac929629ad96f47a0455b844d554893ca04d0"} Dec 03 20:07:54.527249 master-0 kubenswrapper[9368]: I1203 20:07:54.527172 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qnhnn" podStartSLOduration=3.120368867 podStartE2EDuration="5.527155286s" podCreationTimestamp="2025-12-03 20:07:49 +0000 UTC" firstStartedPulling="2025-12-03 20:07:51.467900162 +0000 UTC m=+737.129150073" lastFinishedPulling="2025-12-03 20:07:53.874686581 +0000 UTC m=+739.535936492" observedRunningTime="2025-12-03 20:07:54.526332567 +0000 UTC m=+740.187582478" watchObservedRunningTime="2025-12-03 20:07:54.527155286 +0000 UTC m=+740.188405197" Dec 03 20:07:54.545199 master-0 
kubenswrapper[9368]: I1203 20:07:54.545138 9368 scope.go:117] "RemoveContainer" containerID="30e0205b9f3aae7684b5e5dd37ee0470857f4a7020b8a45ab64071c7372511a7" Dec 03 20:07:54.545630 master-0 kubenswrapper[9368]: E1203 20:07:54.545585 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-8xmrv_openshift-config-operator(0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" Dec 03 20:07:54.564310 master-0 kubenswrapper[9368]: I1203 20:07:54.564199 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vnt24" podStartSLOduration=3.109541819 podStartE2EDuration="5.564176006s" podCreationTimestamp="2025-12-03 20:07:49 +0000 UTC" firstStartedPulling="2025-12-03 20:07:51.457230478 +0000 UTC m=+737.118480419" lastFinishedPulling="2025-12-03 20:07:53.911864695 +0000 UTC m=+739.573114606" observedRunningTime="2025-12-03 20:07:54.559816216 +0000 UTC m=+740.221066127" watchObservedRunningTime="2025-12-03 20:07:54.564176006 +0000 UTC m=+740.225425927" Dec 03 20:07:56.544434 master-0 kubenswrapper[9368]: I1203 20:07:56.544372 9368 scope.go:117] "RemoveContainer" containerID="2fddc42d6267903d2d9ec20253e1576f35e19a3bb53e9ddf0c42ac6c45e614ec" Dec 03 20:07:56.545075 master-0 kubenswrapper[9368]: I1203 20:07:56.544442 9368 scope.go:117] "RemoveContainer" containerID="cec06c56e683cc0577fad0a71ec4c6d696a85de6a5454d15d4616410438d6c01" Dec 03 20:07:56.545075 master-0 kubenswrapper[9368]: E1203 20:07:56.544602 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed 
container=openshift-controller-manager-operator pod=openshift-controller-manager-operator-7c4697b5f5-8jzqh_openshift-controller-manager-operator(daa8efc0-4514-4a14-80f5-ab9eca53a127)\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" podUID="daa8efc0-4514-4a14-80f5-ab9eca53a127" Dec 03 20:07:57.532440 master-0 kubenswrapper[9368]: I1203 20:07:57.532357 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-7b795784b8-4gppw_b84835e3-e8bc-4aa4-a8f3-f9be702a358a/csi-snapshot-controller-operator/2.log" Dec 03 20:07:57.532440 master-0 kubenswrapper[9368]: I1203 20:07:57.532437 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw" event={"ID":"b84835e3-e8bc-4aa4-a8f3-f9be702a358a","Type":"ContainerStarted","Data":"8c5ada18728356f7491aa385a58cbac2724726ee4edfbbaf1720e887daf78551"} Dec 03 20:07:57.543931 master-0 kubenswrapper[9368]: I1203 20:07:57.543880 9368 scope.go:117] "RemoveContainer" containerID="9bdf161e72b6c048ac479aec18a819118a43011cc40adece64e6528d1dc8ecda" Dec 03 20:07:57.875203 master-0 kubenswrapper[9368]: I1203 20:07:57.875034 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hb9ml" Dec 03 20:07:57.875203 master-0 kubenswrapper[9368]: I1203 20:07:57.875109 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hb9ml" Dec 03 20:07:57.893084 master-0 kubenswrapper[9368]: I1203 20:07:57.892999 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bs7zk" Dec 03 20:07:57.893084 master-0 kubenswrapper[9368]: I1203 20:07:57.893075 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-bs7zk" Dec 03 20:07:57.932409 master-0 kubenswrapper[9368]: I1203 20:07:57.932329 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hb9ml" Dec 03 20:07:57.963489 master-0 kubenswrapper[9368]: I1203 20:07:57.963350 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bs7zk" Dec 03 20:07:58.594055 master-0 kubenswrapper[9368]: I1203 20:07:58.594023 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hb9ml" Dec 03 20:07:58.596082 master-0 kubenswrapper[9368]: I1203 20:07:58.596045 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bs7zk" Dec 03 20:07:59.506251 master-0 kubenswrapper[9368]: I1203 20:07:59.506187 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hb9ml"] Dec 03 20:07:59.544529 master-0 kubenswrapper[9368]: I1203 20:07:59.544459 9368 scope.go:117] "RemoveContainer" containerID="0b22734703d42f07c436963e348c3be11ab4f5053e6afed5996abb0dab7d690d" Dec 03 20:07:59.544819 master-0 kubenswrapper[9368]: E1203 20:07:59.544731 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=authentication-operator pod=authentication-operator-7479ffdf48-mfwhz_openshift-authentication-operator(a185ee17-4b4b-4d20-a8ed-56a2a01f1807)\"" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podUID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" Dec 03 20:07:59.550322 master-0 kubenswrapper[9368]: I1203 20:07:59.550285 9368 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-service-ca_service-ca-6b8bb995f7-bj4vz_63e3d36d-1676-4f90-ac9a-d85b861a4655/service-ca-controller/2.log" Dec 03 20:07:59.550594 master-0 kubenswrapper[9368]: I1203 20:07:59.550398 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" event={"ID":"63e3d36d-1676-4f90-ac9a-d85b861a4655","Type":"ContainerStarted","Data":"314686e4bb12cccbaa32c3dc40dda65aa95cdc54824eb41e5778decce7ddbd0e"} Dec 03 20:08:00.060267 master-0 kubenswrapper[9368]: I1203 20:08:00.060163 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vnt24" Dec 03 20:08:00.060267 master-0 kubenswrapper[9368]: I1203 20:08:00.060258 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vnt24" Dec 03 20:08:00.294606 master-0 kubenswrapper[9368]: I1203 20:08:00.294520 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qnhnn" Dec 03 20:08:00.294606 master-0 kubenswrapper[9368]: I1203 20:08:00.294597 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qnhnn" Dec 03 20:08:00.361571 master-0 kubenswrapper[9368]: I1203 20:08:00.361465 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qnhnn" Dec 03 20:08:00.544557 master-0 kubenswrapper[9368]: I1203 20:08:00.544493 9368 scope.go:117] "RemoveContainer" containerID="64faeeb7a4647a9e5dd702400fe60f14013f02b00360bb310c4d37859f33d70c" Dec 03 20:08:00.545566 master-0 kubenswrapper[9368]: E1203 20:08:00.544892 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager-operator 
pod=kube-controller-manager-operator-b5dddf8f5-79ccj_openshift-kube-controller-manager-operator(e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" podUID="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3" Dec 03 20:08:00.560089 master-0 kubenswrapper[9368]: I1203 20:08:00.559999 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hb9ml" podUID="4d5dddf7-f9a3-4169-89bd-dedee0aacdd9" containerName="registry-server" containerID="cri-o://50f2ca5f355904fd35520fc293fbaef74acbdec1dd8237e3cec6e9a68b8c2389" gracePeriod=2 Dec 03 20:08:00.622725 master-0 kubenswrapper[9368]: I1203 20:08:00.622557 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qnhnn" Dec 03 20:08:00.906379 master-0 kubenswrapper[9368]: I1203 20:08:00.906174 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bs7zk"] Dec 03 20:08:00.907143 master-0 kubenswrapper[9368]: I1203 20:08:00.906438 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bs7zk" podUID="d164eb04-3911-486e-a66b-828098a04e9f" containerName="registry-server" containerID="cri-o://ea336a5c6da87525e71ebeeed9d24688824102b55e1392263985ce7edfe9cc69" gracePeriod=2 Dec 03 20:08:01.057836 master-0 kubenswrapper[9368]: I1203 20:08:01.057763 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hb9ml" Dec 03 20:08:01.081642 master-0 kubenswrapper[9368]: I1203 20:08:01.080640 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt78x\" (UniqueName: \"kubernetes.io/projected/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9-kube-api-access-pt78x\") pod \"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9\" (UID: \"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9\") " Dec 03 20:08:01.081642 master-0 kubenswrapper[9368]: I1203 20:08:01.080820 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9-catalog-content\") pod \"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9\" (UID: \"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9\") " Dec 03 20:08:01.081642 master-0 kubenswrapper[9368]: I1203 20:08:01.080846 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9-utilities\") pod \"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9\" (UID: \"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9\") " Dec 03 20:08:01.083267 master-0 kubenswrapper[9368]: I1203 20:08:01.083209 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9-utilities" (OuterVolumeSpecName: "utilities") pod "4d5dddf7-f9a3-4169-89bd-dedee0aacdd9" (UID: "4d5dddf7-f9a3-4169-89bd-dedee0aacdd9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:08:01.095889 master-0 kubenswrapper[9368]: I1203 20:08:01.093051 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9-kube-api-access-pt78x" (OuterVolumeSpecName: "kube-api-access-pt78x") pod "4d5dddf7-f9a3-4169-89bd-dedee0aacdd9" (UID: "4d5dddf7-f9a3-4169-89bd-dedee0aacdd9"). 
InnerVolumeSpecName "kube-api-access-pt78x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:08:01.126024 master-0 kubenswrapper[9368]: I1203 20:08:01.125958 9368 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vnt24" podUID="4b507554-0ccf-474e-b674-3546d174419d" containerName="registry-server" probeResult="failure" output=< Dec 03 20:08:01.126024 master-0 kubenswrapper[9368]: timeout: failed to connect service ":50051" within 1s Dec 03 20:08:01.126024 master-0 kubenswrapper[9368]: > Dec 03 20:08:01.173508 master-0 kubenswrapper[9368]: I1203 20:08:01.173368 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d5dddf7-f9a3-4169-89bd-dedee0aacdd9" (UID: "4d5dddf7-f9a3-4169-89bd-dedee0aacdd9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:08:01.184648 master-0 kubenswrapper[9368]: I1203 20:08:01.184579 9368 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:01.184648 master-0 kubenswrapper[9368]: I1203 20:08:01.184628 9368 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:01.184648 master-0 kubenswrapper[9368]: I1203 20:08:01.184641 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt78x\" (UniqueName: \"kubernetes.io/projected/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9-kube-api-access-pt78x\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:01.335027 master-0 kubenswrapper[9368]: I1203 20:08:01.334939 9368 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-bs7zk" Dec 03 20:08:01.386967 master-0 kubenswrapper[9368]: I1203 20:08:01.386877 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvwqx\" (UniqueName: \"kubernetes.io/projected/d164eb04-3911-486e-a66b-828098a04e9f-kube-api-access-tvwqx\") pod \"d164eb04-3911-486e-a66b-828098a04e9f\" (UID: \"d164eb04-3911-486e-a66b-828098a04e9f\") " Dec 03 20:08:01.387208 master-0 kubenswrapper[9368]: I1203 20:08:01.387055 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d164eb04-3911-486e-a66b-828098a04e9f-utilities\") pod \"d164eb04-3911-486e-a66b-828098a04e9f\" (UID: \"d164eb04-3911-486e-a66b-828098a04e9f\") " Dec 03 20:08:01.387208 master-0 kubenswrapper[9368]: I1203 20:08:01.387168 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d164eb04-3911-486e-a66b-828098a04e9f-catalog-content\") pod \"d164eb04-3911-486e-a66b-828098a04e9f\" (UID: \"d164eb04-3911-486e-a66b-828098a04e9f\") " Dec 03 20:08:01.389079 master-0 kubenswrapper[9368]: I1203 20:08:01.388958 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d164eb04-3911-486e-a66b-828098a04e9f-utilities" (OuterVolumeSpecName: "utilities") pod "d164eb04-3911-486e-a66b-828098a04e9f" (UID: "d164eb04-3911-486e-a66b-828098a04e9f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:08:01.389565 master-0 kubenswrapper[9368]: I1203 20:08:01.389509 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d164eb04-3911-486e-a66b-828098a04e9f-kube-api-access-tvwqx" (OuterVolumeSpecName: "kube-api-access-tvwqx") pod "d164eb04-3911-486e-a66b-828098a04e9f" (UID: "d164eb04-3911-486e-a66b-828098a04e9f"). InnerVolumeSpecName "kube-api-access-tvwqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:08:01.488990 master-0 kubenswrapper[9368]: I1203 20:08:01.488827 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvwqx\" (UniqueName: \"kubernetes.io/projected/d164eb04-3911-486e-a66b-828098a04e9f-kube-api-access-tvwqx\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:01.488990 master-0 kubenswrapper[9368]: I1203 20:08:01.488887 9368 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d164eb04-3911-486e-a66b-828098a04e9f-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:01.545267 master-0 kubenswrapper[9368]: I1203 20:08:01.545155 9368 scope.go:117] "RemoveContainer" containerID="c2730eaef31938f9b283223c81622c1d4bbc549630ded57fc1762a2568d60b23" Dec 03 20:08:01.546109 master-0 kubenswrapper[9368]: E1203 20:08:01.545607 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=service-ca-operator pod=service-ca-operator-56f5898f45-v6rp5_openshift-service-ca-operator(01d51d9a-9beb-4357-9dc2-aeac210cd0c4)\"" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" podUID="01d51d9a-9beb-4357-9dc2-aeac210cd0c4" Dec 03 20:08:01.569740 master-0 kubenswrapper[9368]: I1203 20:08:01.569663 9368 generic.go:334] "Generic (PLEG): container finished" podID="4d5dddf7-f9a3-4169-89bd-dedee0aacdd9" 
containerID="50f2ca5f355904fd35520fc293fbaef74acbdec1dd8237e3cec6e9a68b8c2389" exitCode=0 Dec 03 20:08:01.569950 master-0 kubenswrapper[9368]: I1203 20:08:01.569736 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hb9ml" event={"ID":"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9","Type":"ContainerDied","Data":"50f2ca5f355904fd35520fc293fbaef74acbdec1dd8237e3cec6e9a68b8c2389"} Dec 03 20:08:01.569950 master-0 kubenswrapper[9368]: I1203 20:08:01.569864 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hb9ml" event={"ID":"4d5dddf7-f9a3-4169-89bd-dedee0aacdd9","Type":"ContainerDied","Data":"2053a09db44895fb2c7bdb65e57180837936a611d76aaf8c24e1648e544c2463"} Dec 03 20:08:01.569950 master-0 kubenswrapper[9368]: I1203 20:08:01.569887 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hb9ml" Dec 03 20:08:01.569950 master-0 kubenswrapper[9368]: I1203 20:08:01.569913 9368 scope.go:117] "RemoveContainer" containerID="50f2ca5f355904fd35520fc293fbaef74acbdec1dd8237e3cec6e9a68b8c2389" Dec 03 20:08:01.572858 master-0 kubenswrapper[9368]: I1203 20:08:01.572808 9368 generic.go:334] "Generic (PLEG): container finished" podID="d164eb04-3911-486e-a66b-828098a04e9f" containerID="ea336a5c6da87525e71ebeeed9d24688824102b55e1392263985ce7edfe9cc69" exitCode=0 Dec 03 20:08:01.572960 master-0 kubenswrapper[9368]: I1203 20:08:01.572877 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bs7zk" event={"ID":"d164eb04-3911-486e-a66b-828098a04e9f","Type":"ContainerDied","Data":"ea336a5c6da87525e71ebeeed9d24688824102b55e1392263985ce7edfe9cc69"} Dec 03 20:08:01.572960 master-0 kubenswrapper[9368]: I1203 20:08:01.572918 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bs7zk" Dec 03 20:08:01.572960 master-0 kubenswrapper[9368]: I1203 20:08:01.572928 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bs7zk" event={"ID":"d164eb04-3911-486e-a66b-828098a04e9f","Type":"ContainerDied","Data":"429132f48eae082ad30a12cc017ed2e511b1e9dd5df5a1f63704adb2b682a31b"} Dec 03 20:08:01.588488 master-0 kubenswrapper[9368]: I1203 20:08:01.588399 9368 scope.go:117] "RemoveContainer" containerID="51f872968f7dfc0454c4bd867d2abc6f217f582b4721299fe79cd5b6089b9274" Dec 03 20:08:01.614773 master-0 kubenswrapper[9368]: I1203 20:08:01.614725 9368 scope.go:117] "RemoveContainer" containerID="9acc151a0309db37cab47dfef4aff287d328a4b21d2f88e157cf49cb2c1eb135" Dec 03 20:08:01.620884 master-0 kubenswrapper[9368]: I1203 20:08:01.620820 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d164eb04-3911-486e-a66b-828098a04e9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d164eb04-3911-486e-a66b-828098a04e9f" (UID: "d164eb04-3911-486e-a66b-828098a04e9f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:08:01.632804 master-0 kubenswrapper[9368]: I1203 20:08:01.632735 9368 scope.go:117] "RemoveContainer" containerID="50f2ca5f355904fd35520fc293fbaef74acbdec1dd8237e3cec6e9a68b8c2389" Dec 03 20:08:01.633165 master-0 kubenswrapper[9368]: E1203 20:08:01.633115 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50f2ca5f355904fd35520fc293fbaef74acbdec1dd8237e3cec6e9a68b8c2389\": container with ID starting with 50f2ca5f355904fd35520fc293fbaef74acbdec1dd8237e3cec6e9a68b8c2389 not found: ID does not exist" containerID="50f2ca5f355904fd35520fc293fbaef74acbdec1dd8237e3cec6e9a68b8c2389" Dec 03 20:08:01.633284 master-0 kubenswrapper[9368]: I1203 20:08:01.633167 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50f2ca5f355904fd35520fc293fbaef74acbdec1dd8237e3cec6e9a68b8c2389"} err="failed to get container status \"50f2ca5f355904fd35520fc293fbaef74acbdec1dd8237e3cec6e9a68b8c2389\": rpc error: code = NotFound desc = could not find container \"50f2ca5f355904fd35520fc293fbaef74acbdec1dd8237e3cec6e9a68b8c2389\": container with ID starting with 50f2ca5f355904fd35520fc293fbaef74acbdec1dd8237e3cec6e9a68b8c2389 not found: ID does not exist" Dec 03 20:08:01.633284 master-0 kubenswrapper[9368]: I1203 20:08:01.633190 9368 scope.go:117] "RemoveContainer" containerID="51f872968f7dfc0454c4bd867d2abc6f217f582b4721299fe79cd5b6089b9274" Dec 03 20:08:01.633613 master-0 kubenswrapper[9368]: E1203 20:08:01.633570 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51f872968f7dfc0454c4bd867d2abc6f217f582b4721299fe79cd5b6089b9274\": container with ID starting with 51f872968f7dfc0454c4bd867d2abc6f217f582b4721299fe79cd5b6089b9274 not found: ID does not exist" containerID="51f872968f7dfc0454c4bd867d2abc6f217f582b4721299fe79cd5b6089b9274" 
Dec 03 20:08:01.633713 master-0 kubenswrapper[9368]: I1203 20:08:01.633625 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f872968f7dfc0454c4bd867d2abc6f217f582b4721299fe79cd5b6089b9274"} err="failed to get container status \"51f872968f7dfc0454c4bd867d2abc6f217f582b4721299fe79cd5b6089b9274\": rpc error: code = NotFound desc = could not find container \"51f872968f7dfc0454c4bd867d2abc6f217f582b4721299fe79cd5b6089b9274\": container with ID starting with 51f872968f7dfc0454c4bd867d2abc6f217f582b4721299fe79cd5b6089b9274 not found: ID does not exist" Dec 03 20:08:01.633713 master-0 kubenswrapper[9368]: I1203 20:08:01.633709 9368 scope.go:117] "RemoveContainer" containerID="9acc151a0309db37cab47dfef4aff287d328a4b21d2f88e157cf49cb2c1eb135" Dec 03 20:08:01.634110 master-0 kubenswrapper[9368]: E1203 20:08:01.634082 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9acc151a0309db37cab47dfef4aff287d328a4b21d2f88e157cf49cb2c1eb135\": container with ID starting with 9acc151a0309db37cab47dfef4aff287d328a4b21d2f88e157cf49cb2c1eb135 not found: ID does not exist" containerID="9acc151a0309db37cab47dfef4aff287d328a4b21d2f88e157cf49cb2c1eb135" Dec 03 20:08:01.634210 master-0 kubenswrapper[9368]: I1203 20:08:01.634109 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9acc151a0309db37cab47dfef4aff287d328a4b21d2f88e157cf49cb2c1eb135"} err="failed to get container status \"9acc151a0309db37cab47dfef4aff287d328a4b21d2f88e157cf49cb2c1eb135\": rpc error: code = NotFound desc = could not find container \"9acc151a0309db37cab47dfef4aff287d328a4b21d2f88e157cf49cb2c1eb135\": container with ID starting with 9acc151a0309db37cab47dfef4aff287d328a4b21d2f88e157cf49cb2c1eb135 not found: ID does not exist" Dec 03 20:08:01.634210 master-0 kubenswrapper[9368]: I1203 20:08:01.634125 9368 scope.go:117] "RemoveContainer" 
containerID="ea336a5c6da87525e71ebeeed9d24688824102b55e1392263985ce7edfe9cc69" Dec 03 20:08:01.654439 master-0 kubenswrapper[9368]: I1203 20:08:01.654402 9368 scope.go:117] "RemoveContainer" containerID="ed5cf6bbbd1d743d847de0bbebfc49b46a16930cec22cd1dccc435991c4b1bc0" Dec 03 20:08:01.679809 master-0 kubenswrapper[9368]: I1203 20:08:01.679730 9368 scope.go:117] "RemoveContainer" containerID="c6059341c595203e9601330dcbb748e3aa38e2ceee758569d4d6a2c774d57639" Dec 03 20:08:01.694762 master-0 kubenswrapper[9368]: I1203 20:08:01.694671 9368 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d164eb04-3911-486e-a66b-828098a04e9f-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:01.724978 master-0 kubenswrapper[9368]: I1203 20:08:01.724906 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hb9ml"] Dec 03 20:08:01.740043 master-0 kubenswrapper[9368]: I1203 20:08:01.735138 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hb9ml"] Dec 03 20:08:01.746800 master-0 kubenswrapper[9368]: I1203 20:08:01.746146 9368 scope.go:117] "RemoveContainer" containerID="ea336a5c6da87525e71ebeeed9d24688824102b55e1392263985ce7edfe9cc69" Dec 03 20:08:01.750802 master-0 kubenswrapper[9368]: E1203 20:08:01.747380 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea336a5c6da87525e71ebeeed9d24688824102b55e1392263985ce7edfe9cc69\": container with ID starting with ea336a5c6da87525e71ebeeed9d24688824102b55e1392263985ce7edfe9cc69 not found: ID does not exist" containerID="ea336a5c6da87525e71ebeeed9d24688824102b55e1392263985ce7edfe9cc69" Dec 03 20:08:01.750802 master-0 kubenswrapper[9368]: I1203 20:08:01.747424 9368 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ea336a5c6da87525e71ebeeed9d24688824102b55e1392263985ce7edfe9cc69"} err="failed to get container status \"ea336a5c6da87525e71ebeeed9d24688824102b55e1392263985ce7edfe9cc69\": rpc error: code = NotFound desc = could not find container \"ea336a5c6da87525e71ebeeed9d24688824102b55e1392263985ce7edfe9cc69\": container with ID starting with ea336a5c6da87525e71ebeeed9d24688824102b55e1392263985ce7edfe9cc69 not found: ID does not exist" Dec 03 20:08:01.750802 master-0 kubenswrapper[9368]: I1203 20:08:01.747453 9368 scope.go:117] "RemoveContainer" containerID="ed5cf6bbbd1d743d847de0bbebfc49b46a16930cec22cd1dccc435991c4b1bc0" Dec 03 20:08:01.752025 master-0 kubenswrapper[9368]: E1203 20:08:01.751822 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed5cf6bbbd1d743d847de0bbebfc49b46a16930cec22cd1dccc435991c4b1bc0\": container with ID starting with ed5cf6bbbd1d743d847de0bbebfc49b46a16930cec22cd1dccc435991c4b1bc0 not found: ID does not exist" containerID="ed5cf6bbbd1d743d847de0bbebfc49b46a16930cec22cd1dccc435991c4b1bc0" Dec 03 20:08:01.752025 master-0 kubenswrapper[9368]: I1203 20:08:01.751899 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed5cf6bbbd1d743d847de0bbebfc49b46a16930cec22cd1dccc435991c4b1bc0"} err="failed to get container status \"ed5cf6bbbd1d743d847de0bbebfc49b46a16930cec22cd1dccc435991c4b1bc0\": rpc error: code = NotFound desc = could not find container \"ed5cf6bbbd1d743d847de0bbebfc49b46a16930cec22cd1dccc435991c4b1bc0\": container with ID starting with ed5cf6bbbd1d743d847de0bbebfc49b46a16930cec22cd1dccc435991c4b1bc0 not found: ID does not exist" Dec 03 20:08:01.752025 master-0 kubenswrapper[9368]: I1203 20:08:01.751943 9368 scope.go:117] "RemoveContainer" containerID="c6059341c595203e9601330dcbb748e3aa38e2ceee758569d4d6a2c774d57639" Dec 03 20:08:01.753401 master-0 kubenswrapper[9368]: E1203 
20:08:01.753355 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6059341c595203e9601330dcbb748e3aa38e2ceee758569d4d6a2c774d57639\": container with ID starting with c6059341c595203e9601330dcbb748e3aa38e2ceee758569d4d6a2c774d57639 not found: ID does not exist" containerID="c6059341c595203e9601330dcbb748e3aa38e2ceee758569d4d6a2c774d57639" Dec 03 20:08:01.753461 master-0 kubenswrapper[9368]: I1203 20:08:01.753398 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6059341c595203e9601330dcbb748e3aa38e2ceee758569d4d6a2c774d57639"} err="failed to get container status \"c6059341c595203e9601330dcbb748e3aa38e2ceee758569d4d6a2c774d57639\": rpc error: code = NotFound desc = could not find container \"c6059341c595203e9601330dcbb748e3aa38e2ceee758569d4d6a2c774d57639\": container with ID starting with c6059341c595203e9601330dcbb748e3aa38e2ceee758569d4d6a2c774d57639 not found: ID does not exist" Dec 03 20:08:01.911621 master-0 kubenswrapper[9368]: I1203 20:08:01.911533 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bs7zk"] Dec 03 20:08:01.916392 master-0 kubenswrapper[9368]: I1203 20:08:01.916334 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bs7zk"] Dec 03 20:08:02.545117 master-0 kubenswrapper[9368]: I1203 20:08:02.545046 9368 scope.go:117] "RemoveContainer" containerID="db25cf44f0675c418850d8d41463efcb1765ff94722958664210b9165ac00ff3" Dec 03 20:08:02.545538 master-0 kubenswrapper[9368]: I1203 20:08:02.545317 9368 scope.go:117] "RemoveContainer" containerID="89ed390af07eecb0f2a6fd24fe986b57e8e8f83dbf2ff2202963967a2fcc7b5e" Dec 03 20:08:02.545538 master-0 kubenswrapper[9368]: E1203 20:08:02.545391 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with CrashLoopBackOff: 
\"back-off 1m20s restarting failed container=openshift-apiserver-operator pod=openshift-apiserver-operator-667484ff5-lsltt_openshift-apiserver-operator(d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f)\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" podUID="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f" Dec 03 20:08:02.546194 master-0 kubenswrapper[9368]: E1203 20:08:02.546127 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-67c4cff67d-p7xj5_openshift-kube-storage-version-migrator-operator(11e2c94f-f9e9-415b-a550-3006a4632ba4)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" podUID="11e2c94f-f9e9-415b-a550-3006a4632ba4" Dec 03 20:08:02.569855 master-0 kubenswrapper[9368]: I1203 20:08:02.569744 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d5dddf7-f9a3-4169-89bd-dedee0aacdd9" path="/var/lib/kubelet/pods/4d5dddf7-f9a3-4169-89bd-dedee0aacdd9/volumes" Dec 03 20:08:02.572149 master-0 kubenswrapper[9368]: I1203 20:08:02.572104 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d164eb04-3911-486e-a66b-828098a04e9f" path="/var/lib/kubelet/pods/d164eb04-3911-486e-a66b-828098a04e9f/volumes" Dec 03 20:08:03.303014 master-0 kubenswrapper[9368]: I1203 20:08:03.302944 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnhnn"] Dec 03 20:08:03.303233 master-0 kubenswrapper[9368]: I1203 20:08:03.303184 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qnhnn" podUID="190cec37-48ec-49ed-b954-0874a0adaa6b" containerName="registry-server" 
containerID="cri-o://527fa36c882adc00ec5671ba7a7a7e111d413604162f96e6c1645a6e9c47f1c1" gracePeriod=2 Dec 03 20:08:03.544949 master-0 kubenswrapper[9368]: I1203 20:08:03.544845 9368 scope.go:117] "RemoveContainer" containerID="abf1acea0f13046f42e18d29f9f01a5591776e77d3e8cc4b525da74b968fc06b" Dec 03 20:08:03.545231 master-0 kubenswrapper[9368]: I1203 20:08:03.545174 9368 scope.go:117] "RemoveContainer" containerID="728aa51e420a0e8c358ef69d6ddcb175d50c7be37aab4f4fdfde93a0791a7b8e" Dec 03 20:08:03.545436 master-0 kubenswrapper[9368]: E1203 20:08:03.545199 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-operator pod=kube-apiserver-operator-5b557b5f57-9t9fn_openshift-kube-apiserver-operator(943feb0d-7d31-446a-9100-dfc4ef013d12)\"" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" podUID="943feb0d-7d31-446a-9100-dfc4ef013d12" Dec 03 20:08:03.545605 master-0 kubenswrapper[9368]: E1203 20:08:03.545538 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=network-operator pod=network-operator-6cbf58c977-w7d8t_openshift-network-operator(6eb4700c-6af0-468b-afc8-1e09b902d6bf)\"" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" podUID="6eb4700c-6af0-468b-afc8-1e09b902d6bf" Dec 03 20:08:03.592668 master-0 kubenswrapper[9368]: I1203 20:08:03.592495 9368 generic.go:334] "Generic (PLEG): container finished" podID="190cec37-48ec-49ed-b954-0874a0adaa6b" containerID="527fa36c882adc00ec5671ba7a7a7e111d413604162f96e6c1645a6e9c47f1c1" exitCode=0 Dec 03 20:08:03.592668 master-0 kubenswrapper[9368]: I1203 20:08:03.592546 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnhnn" 
event={"ID":"190cec37-48ec-49ed-b954-0874a0adaa6b","Type":"ContainerDied","Data":"527fa36c882adc00ec5671ba7a7a7e111d413604162f96e6c1645a6e9c47f1c1"} Dec 03 20:08:04.070028 master-0 kubenswrapper[9368]: I1203 20:08:04.069964 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnhnn" Dec 03 20:08:04.229584 master-0 kubenswrapper[9368]: I1203 20:08:04.229335 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5qv2\" (UniqueName: \"kubernetes.io/projected/190cec37-48ec-49ed-b954-0874a0adaa6b-kube-api-access-m5qv2\") pod \"190cec37-48ec-49ed-b954-0874a0adaa6b\" (UID: \"190cec37-48ec-49ed-b954-0874a0adaa6b\") " Dec 03 20:08:04.229584 master-0 kubenswrapper[9368]: I1203 20:08:04.229576 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/190cec37-48ec-49ed-b954-0874a0adaa6b-catalog-content\") pod \"190cec37-48ec-49ed-b954-0874a0adaa6b\" (UID: \"190cec37-48ec-49ed-b954-0874a0adaa6b\") " Dec 03 20:08:04.230183 master-0 kubenswrapper[9368]: I1203 20:08:04.229666 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/190cec37-48ec-49ed-b954-0874a0adaa6b-utilities\") pod \"190cec37-48ec-49ed-b954-0874a0adaa6b\" (UID: \"190cec37-48ec-49ed-b954-0874a0adaa6b\") " Dec 03 20:08:04.231658 master-0 kubenswrapper[9368]: I1203 20:08:04.231572 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/190cec37-48ec-49ed-b954-0874a0adaa6b-utilities" (OuterVolumeSpecName: "utilities") pod "190cec37-48ec-49ed-b954-0874a0adaa6b" (UID: "190cec37-48ec-49ed-b954-0874a0adaa6b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:08:04.234959 master-0 kubenswrapper[9368]: I1203 20:08:04.234912 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/190cec37-48ec-49ed-b954-0874a0adaa6b-kube-api-access-m5qv2" (OuterVolumeSpecName: "kube-api-access-m5qv2") pod "190cec37-48ec-49ed-b954-0874a0adaa6b" (UID: "190cec37-48ec-49ed-b954-0874a0adaa6b"). InnerVolumeSpecName "kube-api-access-m5qv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:08:04.252500 master-0 kubenswrapper[9368]: I1203 20:08:04.252428 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/190cec37-48ec-49ed-b954-0874a0adaa6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "190cec37-48ec-49ed-b954-0874a0adaa6b" (UID: "190cec37-48ec-49ed-b954-0874a0adaa6b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:08:04.331758 master-0 kubenswrapper[9368]: I1203 20:08:04.331660 9368 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/190cec37-48ec-49ed-b954-0874a0adaa6b-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:04.331758 master-0 kubenswrapper[9368]: I1203 20:08:04.331706 9368 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/190cec37-48ec-49ed-b954-0874a0adaa6b-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:04.331758 master-0 kubenswrapper[9368]: I1203 20:08:04.331719 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5qv2\" (UniqueName: \"kubernetes.io/projected/190cec37-48ec-49ed-b954-0874a0adaa6b-kube-api-access-m5qv2\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:04.606279 master-0 kubenswrapper[9368]: I1203 20:08:04.606181 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-qnhnn" event={"ID":"190cec37-48ec-49ed-b954-0874a0adaa6b","Type":"ContainerDied","Data":"69e77e77670a597953c26f68f2df83dffd82b040457787598e8a59bb203a8f46"} Dec 03 20:08:04.606279 master-0 kubenswrapper[9368]: I1203 20:08:04.606259 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnhnn" Dec 03 20:08:04.607153 master-0 kubenswrapper[9368]: I1203 20:08:04.606266 9368 scope.go:117] "RemoveContainer" containerID="527fa36c882adc00ec5671ba7a7a7e111d413604162f96e6c1645a6e9c47f1c1" Dec 03 20:08:04.629621 master-0 kubenswrapper[9368]: I1203 20:08:04.629471 9368 scope.go:117] "RemoveContainer" containerID="e9c08afd3d28191fbb6f509fcc0f2344c9eccd21542f10fe794a334e7ec795da" Dec 03 20:08:04.655108 master-0 kubenswrapper[9368]: I1203 20:08:04.654857 9368 scope.go:117] "RemoveContainer" containerID="7d1e43b364132c9e6375ff146b7933b43cca5922fef062c3e7e4a746d696b9bb" Dec 03 20:08:05.936657 master-0 kubenswrapper[9368]: I1203 20:08:05.936559 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnhnn"] Dec 03 20:08:05.944303 master-0 kubenswrapper[9368]: I1203 20:08:05.944229 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnhnn"] Dec 03 20:08:06.544291 master-0 kubenswrapper[9368]: I1203 20:08:06.544244 9368 scope.go:117] "RemoveContainer" containerID="30e0205b9f3aae7684b5e5dd37ee0470857f4a7020b8a45ab64071c7372511a7" Dec 03 20:08:06.544889 master-0 kubenswrapper[9368]: E1203 20:08:06.544865 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-8xmrv_openshift-config-operator(0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9)\"" 
pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" Dec 03 20:08:06.550427 master-0 kubenswrapper[9368]: I1203 20:08:06.550361 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="190cec37-48ec-49ed-b954-0874a0adaa6b" path="/var/lib/kubelet/pods/190cec37-48ec-49ed-b954-0874a0adaa6b/volumes" Dec 03 20:08:07.703796 master-0 kubenswrapper[9368]: I1203 20:08:07.703716 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r2c8x"] Dec 03 20:08:07.704414 master-0 kubenswrapper[9368]: I1203 20:08:07.704077 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r2c8x" podUID="acb1d894-1bc0-478d-87fc-e9137291df70" containerName="registry-server" containerID="cri-o://31c4e29d2c72aa9bd8313b56ca01fa882e7efb110999dd587df1e8251c22b8fa" gracePeriod=2 Dec 03 20:08:07.907551 master-0 kubenswrapper[9368]: I1203 20:08:07.907234 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mc8kx"] Dec 03 20:08:07.907664 master-0 kubenswrapper[9368]: I1203 20:08:07.907573 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mc8kx" podUID="81839b26-cf66-4532-a646-ef4cd5d5e471" containerName="registry-server" containerID="cri-o://2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02" gracePeriod=2 Dec 03 20:08:08.106583 master-0 kubenswrapper[9368]: I1203 20:08:08.106541 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r2c8x" Dec 03 20:08:08.113133 master-0 kubenswrapper[9368]: I1203 20:08:08.113063 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-98lh5"] Dec 03 20:08:08.113343 master-0 kubenswrapper[9368]: E1203 20:08:08.113315 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="190cec37-48ec-49ed-b954-0874a0adaa6b" containerName="extract-content" Dec 03 20:08:08.113343 master-0 kubenswrapper[9368]: I1203 20:08:08.113340 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="190cec37-48ec-49ed-b954-0874a0adaa6b" containerName="extract-content" Dec 03 20:08:08.113418 master-0 kubenswrapper[9368]: E1203 20:08:08.113354 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d164eb04-3911-486e-a66b-828098a04e9f" containerName="extract-utilities" Dec 03 20:08:08.113418 master-0 kubenswrapper[9368]: I1203 20:08:08.113361 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="d164eb04-3911-486e-a66b-828098a04e9f" containerName="extract-utilities" Dec 03 20:08:08.113418 master-0 kubenswrapper[9368]: E1203 20:08:08.113371 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb1d894-1bc0-478d-87fc-e9137291df70" containerName="registry-server" Dec 03 20:08:08.113418 master-0 kubenswrapper[9368]: I1203 20:08:08.113378 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb1d894-1bc0-478d-87fc-e9137291df70" containerName="registry-server" Dec 03 20:08:08.113418 master-0 kubenswrapper[9368]: E1203 20:08:08.113389 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5dddf7-f9a3-4169-89bd-dedee0aacdd9" containerName="registry-server" Dec 03 20:08:08.113418 master-0 kubenswrapper[9368]: I1203 20:08:08.113395 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5dddf7-f9a3-4169-89bd-dedee0aacdd9" containerName="registry-server" Dec 03 20:08:08.113418 master-0 
kubenswrapper[9368]: E1203 20:08:08.113406 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d164eb04-3911-486e-a66b-828098a04e9f" containerName="registry-server" Dec 03 20:08:08.113418 master-0 kubenswrapper[9368]: I1203 20:08:08.113413 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="d164eb04-3911-486e-a66b-828098a04e9f" containerName="registry-server" Dec 03 20:08:08.113418 master-0 kubenswrapper[9368]: E1203 20:08:08.113423 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="190cec37-48ec-49ed-b954-0874a0adaa6b" containerName="registry-server" Dec 03 20:08:08.113739 master-0 kubenswrapper[9368]: I1203 20:08:08.113430 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="190cec37-48ec-49ed-b954-0874a0adaa6b" containerName="registry-server" Dec 03 20:08:08.113739 master-0 kubenswrapper[9368]: E1203 20:08:08.113438 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5dddf7-f9a3-4169-89bd-dedee0aacdd9" containerName="extract-utilities" Dec 03 20:08:08.113739 master-0 kubenswrapper[9368]: I1203 20:08:08.113444 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5dddf7-f9a3-4169-89bd-dedee0aacdd9" containerName="extract-utilities" Dec 03 20:08:08.113739 master-0 kubenswrapper[9368]: E1203 20:08:08.113460 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d164eb04-3911-486e-a66b-828098a04e9f" containerName="extract-content" Dec 03 20:08:08.113739 master-0 kubenswrapper[9368]: I1203 20:08:08.113466 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="d164eb04-3911-486e-a66b-828098a04e9f" containerName="extract-content" Dec 03 20:08:08.113739 master-0 kubenswrapper[9368]: E1203 20:08:08.113479 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="190cec37-48ec-49ed-b954-0874a0adaa6b" containerName="extract-utilities" Dec 03 20:08:08.113739 master-0 kubenswrapper[9368]: I1203 20:08:08.113490 9368 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="190cec37-48ec-49ed-b954-0874a0adaa6b" containerName="extract-utilities" Dec 03 20:08:08.113739 master-0 kubenswrapper[9368]: E1203 20:08:08.113505 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb1d894-1bc0-478d-87fc-e9137291df70" containerName="extract-content" Dec 03 20:08:08.113739 master-0 kubenswrapper[9368]: I1203 20:08:08.113512 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb1d894-1bc0-478d-87fc-e9137291df70" containerName="extract-content" Dec 03 20:08:08.113739 master-0 kubenswrapper[9368]: E1203 20:08:08.113527 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5dddf7-f9a3-4169-89bd-dedee0aacdd9" containerName="extract-content" Dec 03 20:08:08.113739 master-0 kubenswrapper[9368]: I1203 20:08:08.113534 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5dddf7-f9a3-4169-89bd-dedee0aacdd9" containerName="extract-content" Dec 03 20:08:08.113739 master-0 kubenswrapper[9368]: E1203 20:08:08.113546 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb1d894-1bc0-478d-87fc-e9137291df70" containerName="extract-utilities" Dec 03 20:08:08.113739 master-0 kubenswrapper[9368]: I1203 20:08:08.113553 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb1d894-1bc0-478d-87fc-e9137291df70" containerName="extract-utilities" Dec 03 20:08:08.113739 master-0 kubenswrapper[9368]: I1203 20:08:08.113682 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d5dddf7-f9a3-4169-89bd-dedee0aacdd9" containerName="registry-server" Dec 03 20:08:08.113739 master-0 kubenswrapper[9368]: I1203 20:08:08.113694 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="d164eb04-3911-486e-a66b-828098a04e9f" containerName="registry-server" Dec 03 20:08:08.113739 master-0 kubenswrapper[9368]: I1203 20:08:08.113703 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="190cec37-48ec-49ed-b954-0874a0adaa6b" 
containerName="registry-server" Dec 03 20:08:08.113739 master-0 kubenswrapper[9368]: I1203 20:08:08.113718 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb1d894-1bc0-478d-87fc-e9137291df70" containerName="registry-server" Dec 03 20:08:08.114584 master-0 kubenswrapper[9368]: I1203 20:08:08.114554 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:08:08.116725 master-0 kubenswrapper[9368]: I1203 20:08:08.116653 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-l56l4" Dec 03 20:08:08.125561 master-0 kubenswrapper[9368]: I1203 20:08:08.125410 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-98lh5"] Dec 03 20:08:08.134533 master-0 kubenswrapper[9368]: E1203 20:08:08.134451 9368 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02 is running failed: container process not found" containerID="2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 20:08:08.136076 master-0 kubenswrapper[9368]: E1203 20:08:08.135446 9368 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02 is running failed: container process not found" containerID="2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 20:08:08.136076 master-0 kubenswrapper[9368]: E1203 20:08:08.135836 9368 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02 is running failed: container process not found" containerID="2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 20:08:08.136076 master-0 kubenswrapper[9368]: E1203 20:08:08.135871 9368 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-mc8kx" podUID="81839b26-cf66-4532-a646-ef4cd5d5e471" containerName="registry-server" Dec 03 20:08:08.280368 master-0 kubenswrapper[9368]: I1203 20:08:08.280244 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mc8kx" Dec 03 20:08:08.280553 master-0 kubenswrapper[9368]: I1203 20:08:08.280448 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb1d894-1bc0-478d-87fc-e9137291df70-utilities\") pod \"acb1d894-1bc0-478d-87fc-e9137291df70\" (UID: \"acb1d894-1bc0-478d-87fc-e9137291df70\") " Dec 03 20:08:08.280553 master-0 kubenswrapper[9368]: I1203 20:08:08.280496 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb1d894-1bc0-478d-87fc-e9137291df70-catalog-content\") pod \"acb1d894-1bc0-478d-87fc-e9137291df70\" (UID: \"acb1d894-1bc0-478d-87fc-e9137291df70\") " Dec 03 20:08:08.281367 master-0 kubenswrapper[9368]: I1203 20:08:08.280613 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl7hr\" (UniqueName: \"kubernetes.io/projected/acb1d894-1bc0-478d-87fc-e9137291df70-kube-api-access-pl7hr\") pod \"acb1d894-1bc0-478d-87fc-e9137291df70\" (UID: 
\"acb1d894-1bc0-478d-87fc-e9137291df70\") " Dec 03 20:08:08.281367 master-0 kubenswrapper[9368]: I1203 20:08:08.280757 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81839b26-cf66-4532-a646-ef4cd5d5e471-utilities\") pod \"81839b26-cf66-4532-a646-ef4cd5d5e471\" (UID: \"81839b26-cf66-4532-a646-ef4cd5d5e471\") " Dec 03 20:08:08.281367 master-0 kubenswrapper[9368]: I1203 20:08:08.280824 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81839b26-cf66-4532-a646-ef4cd5d5e471-catalog-content\") pod \"81839b26-cf66-4532-a646-ef4cd5d5e471\" (UID: \"81839b26-cf66-4532-a646-ef4cd5d5e471\") " Dec 03 20:08:08.281367 master-0 kubenswrapper[9368]: I1203 20:08:08.280920 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bdn5\" (UniqueName: \"kubernetes.io/projected/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2-kube-api-access-7bdn5\") pod \"community-operators-98lh5\" (UID: \"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2\") " pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:08:08.281367 master-0 kubenswrapper[9368]: I1203 20:08:08.280988 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2-utilities\") pod \"community-operators-98lh5\" (UID: \"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2\") " pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:08:08.281367 master-0 kubenswrapper[9368]: I1203 20:08:08.281052 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2-catalog-content\") pod \"community-operators-98lh5\" (UID: \"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2\") " 
pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:08:08.281705 master-0 kubenswrapper[9368]: I1203 20:08:08.281456 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acb1d894-1bc0-478d-87fc-e9137291df70-utilities" (OuterVolumeSpecName: "utilities") pod "acb1d894-1bc0-478d-87fc-e9137291df70" (UID: "acb1d894-1bc0-478d-87fc-e9137291df70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:08:08.283043 master-0 kubenswrapper[9368]: I1203 20:08:08.282266 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81839b26-cf66-4532-a646-ef4cd5d5e471-utilities" (OuterVolumeSpecName: "utilities") pod "81839b26-cf66-4532-a646-ef4cd5d5e471" (UID: "81839b26-cf66-4532-a646-ef4cd5d5e471"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:08:08.285665 master-0 kubenswrapper[9368]: I1203 20:08:08.284592 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb1d894-1bc0-478d-87fc-e9137291df70-kube-api-access-pl7hr" (OuterVolumeSpecName: "kube-api-access-pl7hr") pod "acb1d894-1bc0-478d-87fc-e9137291df70" (UID: "acb1d894-1bc0-478d-87fc-e9137291df70"). InnerVolumeSpecName "kube-api-access-pl7hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:08:08.306711 master-0 kubenswrapper[9368]: I1203 20:08:08.306409 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81839b26-cf66-4532-a646-ef4cd5d5e471-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81839b26-cf66-4532-a646-ef4cd5d5e471" (UID: "81839b26-cf66-4532-a646-ef4cd5d5e471"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:08:08.323907 master-0 kubenswrapper[9368]: I1203 20:08:08.320862 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wcnrx"] Dec 03 20:08:08.323907 master-0 kubenswrapper[9368]: E1203 20:08:08.321249 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81839b26-cf66-4532-a646-ef4cd5d5e471" containerName="extract-utilities" Dec 03 20:08:08.323907 master-0 kubenswrapper[9368]: I1203 20:08:08.321269 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="81839b26-cf66-4532-a646-ef4cd5d5e471" containerName="extract-utilities" Dec 03 20:08:08.323907 master-0 kubenswrapper[9368]: E1203 20:08:08.321302 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81839b26-cf66-4532-a646-ef4cd5d5e471" containerName="extract-content" Dec 03 20:08:08.323907 master-0 kubenswrapper[9368]: I1203 20:08:08.321309 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="81839b26-cf66-4532-a646-ef4cd5d5e471" containerName="extract-content" Dec 03 20:08:08.323907 master-0 kubenswrapper[9368]: E1203 20:08:08.321316 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81839b26-cf66-4532-a646-ef4cd5d5e471" containerName="registry-server" Dec 03 20:08:08.323907 master-0 kubenswrapper[9368]: I1203 20:08:08.321323 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="81839b26-cf66-4532-a646-ef4cd5d5e471" containerName="registry-server" Dec 03 20:08:08.323907 master-0 kubenswrapper[9368]: I1203 20:08:08.321482 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="81839b26-cf66-4532-a646-ef4cd5d5e471" containerName="registry-server" Dec 03 20:08:08.323907 master-0 kubenswrapper[9368]: I1203 20:08:08.322410 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:08:08.326310 master-0 kubenswrapper[9368]: I1203 20:08:08.325358 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-7tjv7" Dec 03 20:08:08.334874 master-0 kubenswrapper[9368]: I1203 20:08:08.334812 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcnrx"] Dec 03 20:08:08.339888 master-0 kubenswrapper[9368]: I1203 20:08:08.339834 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acb1d894-1bc0-478d-87fc-e9137291df70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acb1d894-1bc0-478d-87fc-e9137291df70" (UID: "acb1d894-1bc0-478d-87fc-e9137291df70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:08:08.382508 master-0 kubenswrapper[9368]: I1203 20:08:08.382454 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckcc5\" (UniqueName: \"kubernetes.io/projected/81839b26-cf66-4532-a646-ef4cd5d5e471-kube-api-access-ckcc5\") pod \"81839b26-cf66-4532-a646-ef4cd5d5e471\" (UID: \"81839b26-cf66-4532-a646-ef4cd5d5e471\") " Dec 03 20:08:08.382676 master-0 kubenswrapper[9368]: I1203 20:08:08.382555 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2-catalog-content\") pod \"community-operators-98lh5\" (UID: \"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2\") " pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:08:08.382676 master-0 kubenswrapper[9368]: I1203 20:08:08.382600 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b638f207-31df-4298-8801-4da6031deefc-catalog-content\") pod \"redhat-marketplace-wcnrx\" (UID: \"b638f207-31df-4298-8801-4da6031deefc\") " pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:08:08.382759 master-0 kubenswrapper[9368]: I1203 20:08:08.382624 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trv6b\" (UniqueName: \"kubernetes.io/projected/b638f207-31df-4298-8801-4da6031deefc-kube-api-access-trv6b\") pod \"redhat-marketplace-wcnrx\" (UID: \"b638f207-31df-4298-8801-4da6031deefc\") " pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:08:08.382759 master-0 kubenswrapper[9368]: I1203 20:08:08.382713 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bdn5\" (UniqueName: \"kubernetes.io/projected/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2-kube-api-access-7bdn5\") pod \"community-operators-98lh5\" (UID: \"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2\") " pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:08:08.382885 master-0 kubenswrapper[9368]: I1203 20:08:08.382863 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2-utilities\") pod \"community-operators-98lh5\" (UID: \"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2\") " pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:08:08.382949 master-0 kubenswrapper[9368]: I1203 20:08:08.382903 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b638f207-31df-4298-8801-4da6031deefc-utilities\") pod \"redhat-marketplace-wcnrx\" (UID: \"b638f207-31df-4298-8801-4da6031deefc\") " pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:08:08.383369 master-0 kubenswrapper[9368]: I1203 20:08:08.383274 9368 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-pl7hr\" (UniqueName: \"kubernetes.io/projected/acb1d894-1bc0-478d-87fc-e9137291df70-kube-api-access-pl7hr\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:08.383422 master-0 kubenswrapper[9368]: I1203 20:08:08.383373 9368 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81839b26-cf66-4532-a646-ef4cd5d5e471-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:08.383422 master-0 kubenswrapper[9368]: I1203 20:08:08.383301 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2-catalog-content\") pod \"community-operators-98lh5\" (UID: \"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2\") " pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:08:08.383483 master-0 kubenswrapper[9368]: I1203 20:08:08.383391 9368 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81839b26-cf66-4532-a646-ef4cd5d5e471-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:08.383483 master-0 kubenswrapper[9368]: I1203 20:08:08.383473 9368 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb1d894-1bc0-478d-87fc-e9137291df70-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:08.383546 master-0 kubenswrapper[9368]: I1203 20:08:08.383488 9368 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb1d894-1bc0-478d-87fc-e9137291df70-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:08.383546 master-0 kubenswrapper[9368]: I1203 20:08:08.383337 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2-utilities\") pod 
\"community-operators-98lh5\" (UID: \"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2\") " pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:08:08.385383 master-0 kubenswrapper[9368]: I1203 20:08:08.385353 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81839b26-cf66-4532-a646-ef4cd5d5e471-kube-api-access-ckcc5" (OuterVolumeSpecName: "kube-api-access-ckcc5") pod "81839b26-cf66-4532-a646-ef4cd5d5e471" (UID: "81839b26-cf66-4532-a646-ef4cd5d5e471"). InnerVolumeSpecName "kube-api-access-ckcc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:08:08.396789 master-0 kubenswrapper[9368]: I1203 20:08:08.396726 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bdn5\" (UniqueName: \"kubernetes.io/projected/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2-kube-api-access-7bdn5\") pod \"community-operators-98lh5\" (UID: \"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2\") " pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:08:08.451403 master-0 kubenswrapper[9368]: I1203 20:08:08.451352 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:08:08.484349 master-0 kubenswrapper[9368]: I1203 20:08:08.484304 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b638f207-31df-4298-8801-4da6031deefc-utilities\") pod \"redhat-marketplace-wcnrx\" (UID: \"b638f207-31df-4298-8801-4da6031deefc\") " pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:08:08.484466 master-0 kubenswrapper[9368]: I1203 20:08:08.484374 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b638f207-31df-4298-8801-4da6031deefc-catalog-content\") pod \"redhat-marketplace-wcnrx\" (UID: \"b638f207-31df-4298-8801-4da6031deefc\") " pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:08:08.484466 master-0 kubenswrapper[9368]: I1203 20:08:08.484402 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trv6b\" (UniqueName: \"kubernetes.io/projected/b638f207-31df-4298-8801-4da6031deefc-kube-api-access-trv6b\") pod \"redhat-marketplace-wcnrx\" (UID: \"b638f207-31df-4298-8801-4da6031deefc\") " pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:08:08.484466 master-0 kubenswrapper[9368]: I1203 20:08:08.484448 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckcc5\" (UniqueName: \"kubernetes.io/projected/81839b26-cf66-4532-a646-ef4cd5d5e471-kube-api-access-ckcc5\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:08.485120 master-0 kubenswrapper[9368]: I1203 20:08:08.485059 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b638f207-31df-4298-8801-4da6031deefc-utilities\") pod \"redhat-marketplace-wcnrx\" (UID: \"b638f207-31df-4298-8801-4da6031deefc\") " pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 
20:08:08.485626 master-0 kubenswrapper[9368]: I1203 20:08:08.485561 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b638f207-31df-4298-8801-4da6031deefc-catalog-content\") pod \"redhat-marketplace-wcnrx\" (UID: \"b638f207-31df-4298-8801-4da6031deefc\") " pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:08:08.504889 master-0 kubenswrapper[9368]: I1203 20:08:08.504813 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trv6b\" (UniqueName: \"kubernetes.io/projected/b638f207-31df-4298-8801-4da6031deefc-kube-api-access-trv6b\") pod \"redhat-marketplace-wcnrx\" (UID: \"b638f207-31df-4298-8801-4da6031deefc\") " pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:08:08.544131 master-0 kubenswrapper[9368]: I1203 20:08:08.544091 9368 scope.go:117] "RemoveContainer" containerID="46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967" Dec 03 20:08:08.544527 master-0 kubenswrapper[9368]: I1203 20:08:08.544476 9368 scope.go:117] "RemoveContainer" containerID="2fddc42d6267903d2d9ec20253e1576f35e19a3bb53e9ddf0c42ac6c45e614ec" Dec 03 20:08:08.633646 master-0 kubenswrapper[9368]: I1203 20:08:08.633586 9368 generic.go:334] "Generic (PLEG): container finished" podID="81839b26-cf66-4532-a646-ef4cd5d5e471" containerID="2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02" exitCode=0 Dec 03 20:08:08.633823 master-0 kubenswrapper[9368]: I1203 20:08:08.633658 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc8kx" event={"ID":"81839b26-cf66-4532-a646-ef4cd5d5e471","Type":"ContainerDied","Data":"2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02"} Dec 03 20:08:08.633823 master-0 kubenswrapper[9368]: I1203 20:08:08.633688 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc8kx" 
event={"ID":"81839b26-cf66-4532-a646-ef4cd5d5e471","Type":"ContainerDied","Data":"148d3d0ae63a175305173f860008d660572daa7838487974f2a9f003f59eeff0"} Dec 03 20:08:08.633823 master-0 kubenswrapper[9368]: I1203 20:08:08.633708 9368 scope.go:117] "RemoveContainer" containerID="2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02" Dec 03 20:08:08.633929 master-0 kubenswrapper[9368]: I1203 20:08:08.633850 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mc8kx" Dec 03 20:08:08.639135 master-0 kubenswrapper[9368]: I1203 20:08:08.639088 9368 generic.go:334] "Generic (PLEG): container finished" podID="acb1d894-1bc0-478d-87fc-e9137291df70" containerID="31c4e29d2c72aa9bd8313b56ca01fa882e7efb110999dd587df1e8251c22b8fa" exitCode=0 Dec 03 20:08:08.639201 master-0 kubenswrapper[9368]: I1203 20:08:08.639135 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2c8x" event={"ID":"acb1d894-1bc0-478d-87fc-e9137291df70","Type":"ContainerDied","Data":"31c4e29d2c72aa9bd8313b56ca01fa882e7efb110999dd587df1e8251c22b8fa"} Dec 03 20:08:08.639201 master-0 kubenswrapper[9368]: I1203 20:08:08.639165 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2c8x" event={"ID":"acb1d894-1bc0-478d-87fc-e9137291df70","Type":"ContainerDied","Data":"71864668e71ee0adcfe271632cee980c0921d9d37de64e40c034340e1013deba"} Dec 03 20:08:08.639318 master-0 kubenswrapper[9368]: I1203 20:08:08.639281 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r2c8x" Dec 03 20:08:08.642546 master-0 kubenswrapper[9368]: I1203 20:08:08.642505 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:08:08.678223 master-0 kubenswrapper[9368]: I1203 20:08:08.678190 9368 scope.go:117] "RemoveContainer" containerID="56dbd4722d7e9613178a67c106fd164ecc8009c6b4f5a3da4ca79cccc369cdb2" Dec 03 20:08:08.691716 master-0 kubenswrapper[9368]: I1203 20:08:08.691662 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mc8kx"] Dec 03 20:08:08.697051 master-0 kubenswrapper[9368]: I1203 20:08:08.697004 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mc8kx"] Dec 03 20:08:08.720865 master-0 kubenswrapper[9368]: I1203 20:08:08.720607 9368 scope.go:117] "RemoveContainer" containerID="6a1397a67e232c3bd544bba3152a421dfc26a504ea7d093a2315998dd96b8057" Dec 03 20:08:08.720865 master-0 kubenswrapper[9368]: I1203 20:08:08.720609 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r2c8x"] Dec 03 20:08:08.727748 master-0 kubenswrapper[9368]: I1203 20:08:08.727710 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r2c8x"] Dec 03 20:08:08.748052 master-0 kubenswrapper[9368]: I1203 20:08:08.746587 9368 scope.go:117] "RemoveContainer" containerID="2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02" Dec 03 20:08:08.748052 master-0 kubenswrapper[9368]: E1203 20:08:08.747308 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02\": container with ID starting with 2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02 not found: ID does not exist" containerID="2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02" Dec 03 20:08:08.748052 master-0 kubenswrapper[9368]: I1203 20:08:08.747358 9368 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02"} err="failed to get container status \"2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02\": rpc error: code = NotFound desc = could not find container \"2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02\": container with ID starting with 2d0d8f4d1a2c0f51353ab915e52caebd8ecfef564e81b5fd017cad2cb0718e02 not found: ID does not exist" Dec 03 20:08:08.748052 master-0 kubenswrapper[9368]: I1203 20:08:08.747384 9368 scope.go:117] "RemoveContainer" containerID="56dbd4722d7e9613178a67c106fd164ecc8009c6b4f5a3da4ca79cccc369cdb2" Dec 03 20:08:08.748052 master-0 kubenswrapper[9368]: E1203 20:08:08.747851 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56dbd4722d7e9613178a67c106fd164ecc8009c6b4f5a3da4ca79cccc369cdb2\": container with ID starting with 56dbd4722d7e9613178a67c106fd164ecc8009c6b4f5a3da4ca79cccc369cdb2 not found: ID does not exist" containerID="56dbd4722d7e9613178a67c106fd164ecc8009c6b4f5a3da4ca79cccc369cdb2" Dec 03 20:08:08.748052 master-0 kubenswrapper[9368]: I1203 20:08:08.747871 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56dbd4722d7e9613178a67c106fd164ecc8009c6b4f5a3da4ca79cccc369cdb2"} err="failed to get container status \"56dbd4722d7e9613178a67c106fd164ecc8009c6b4f5a3da4ca79cccc369cdb2\": rpc error: code = NotFound desc = could not find container \"56dbd4722d7e9613178a67c106fd164ecc8009c6b4f5a3da4ca79cccc369cdb2\": container with ID starting with 56dbd4722d7e9613178a67c106fd164ecc8009c6b4f5a3da4ca79cccc369cdb2 not found: ID does not exist" Dec 03 20:08:08.748052 master-0 kubenswrapper[9368]: I1203 20:08:08.747887 9368 scope.go:117] "RemoveContainer" containerID="6a1397a67e232c3bd544bba3152a421dfc26a504ea7d093a2315998dd96b8057" Dec 03 20:08:08.748303 master-0 kubenswrapper[9368]: 
E1203 20:08:08.748230 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a1397a67e232c3bd544bba3152a421dfc26a504ea7d093a2315998dd96b8057\": container with ID starting with 6a1397a67e232c3bd544bba3152a421dfc26a504ea7d093a2315998dd96b8057 not found: ID does not exist" containerID="6a1397a67e232c3bd544bba3152a421dfc26a504ea7d093a2315998dd96b8057" Dec 03 20:08:08.748303 master-0 kubenswrapper[9368]: I1203 20:08:08.748247 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1397a67e232c3bd544bba3152a421dfc26a504ea7d093a2315998dd96b8057"} err="failed to get container status \"6a1397a67e232c3bd544bba3152a421dfc26a504ea7d093a2315998dd96b8057\": rpc error: code = NotFound desc = could not find container \"6a1397a67e232c3bd544bba3152a421dfc26a504ea7d093a2315998dd96b8057\": container with ID starting with 6a1397a67e232c3bd544bba3152a421dfc26a504ea7d093a2315998dd96b8057 not found: ID does not exist" Dec 03 20:08:08.748303 master-0 kubenswrapper[9368]: I1203 20:08:08.748258 9368 scope.go:117] "RemoveContainer" containerID="31c4e29d2c72aa9bd8313b56ca01fa882e7efb110999dd587df1e8251c22b8fa" Dec 03 20:08:08.769918 master-0 kubenswrapper[9368]: I1203 20:08:08.769883 9368 scope.go:117] "RemoveContainer" containerID="206ac88753a216a598da6c64c40223604a0df8c6bc77dd259579b973bbcfb8a9" Dec 03 20:08:08.801462 master-0 kubenswrapper[9368]: I1203 20:08:08.801419 9368 scope.go:117] "RemoveContainer" containerID="379e67ab2df011ed3e1a17dd2f3396c98a79650720f2401a156beabfaf028de9" Dec 03 20:08:08.827169 master-0 kubenswrapper[9368]: I1203 20:08:08.824987 9368 scope.go:117] "RemoveContainer" containerID="31c4e29d2c72aa9bd8313b56ca01fa882e7efb110999dd587df1e8251c22b8fa" Dec 03 20:08:08.827169 master-0 kubenswrapper[9368]: E1203 20:08:08.825720 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"31c4e29d2c72aa9bd8313b56ca01fa882e7efb110999dd587df1e8251c22b8fa\": container with ID starting with 31c4e29d2c72aa9bd8313b56ca01fa882e7efb110999dd587df1e8251c22b8fa not found: ID does not exist" containerID="31c4e29d2c72aa9bd8313b56ca01fa882e7efb110999dd587df1e8251c22b8fa" Dec 03 20:08:08.827169 master-0 kubenswrapper[9368]: I1203 20:08:08.825750 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31c4e29d2c72aa9bd8313b56ca01fa882e7efb110999dd587df1e8251c22b8fa"} err="failed to get container status \"31c4e29d2c72aa9bd8313b56ca01fa882e7efb110999dd587df1e8251c22b8fa\": rpc error: code = NotFound desc = could not find container \"31c4e29d2c72aa9bd8313b56ca01fa882e7efb110999dd587df1e8251c22b8fa\": container with ID starting with 31c4e29d2c72aa9bd8313b56ca01fa882e7efb110999dd587df1e8251c22b8fa not found: ID does not exist" Dec 03 20:08:08.827169 master-0 kubenswrapper[9368]: I1203 20:08:08.825770 9368 scope.go:117] "RemoveContainer" containerID="206ac88753a216a598da6c64c40223604a0df8c6bc77dd259579b973bbcfb8a9" Dec 03 20:08:08.827169 master-0 kubenswrapper[9368]: E1203 20:08:08.826129 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"206ac88753a216a598da6c64c40223604a0df8c6bc77dd259579b973bbcfb8a9\": container with ID starting with 206ac88753a216a598da6c64c40223604a0df8c6bc77dd259579b973bbcfb8a9 not found: ID does not exist" containerID="206ac88753a216a598da6c64c40223604a0df8c6bc77dd259579b973bbcfb8a9" Dec 03 20:08:08.827169 master-0 kubenswrapper[9368]: I1203 20:08:08.826148 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"206ac88753a216a598da6c64c40223604a0df8c6bc77dd259579b973bbcfb8a9"} err="failed to get container status \"206ac88753a216a598da6c64c40223604a0df8c6bc77dd259579b973bbcfb8a9\": rpc error: code = NotFound desc = could not find container 
\"206ac88753a216a598da6c64c40223604a0df8c6bc77dd259579b973bbcfb8a9\": container with ID starting with 206ac88753a216a598da6c64c40223604a0df8c6bc77dd259579b973bbcfb8a9 not found: ID does not exist" Dec 03 20:08:08.827169 master-0 kubenswrapper[9368]: I1203 20:08:08.826162 9368 scope.go:117] "RemoveContainer" containerID="379e67ab2df011ed3e1a17dd2f3396c98a79650720f2401a156beabfaf028de9" Dec 03 20:08:08.827169 master-0 kubenswrapper[9368]: E1203 20:08:08.826697 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"379e67ab2df011ed3e1a17dd2f3396c98a79650720f2401a156beabfaf028de9\": container with ID starting with 379e67ab2df011ed3e1a17dd2f3396c98a79650720f2401a156beabfaf028de9 not found: ID does not exist" containerID="379e67ab2df011ed3e1a17dd2f3396c98a79650720f2401a156beabfaf028de9" Dec 03 20:08:08.827169 master-0 kubenswrapper[9368]: I1203 20:08:08.826755 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"379e67ab2df011ed3e1a17dd2f3396c98a79650720f2401a156beabfaf028de9"} err="failed to get container status \"379e67ab2df011ed3e1a17dd2f3396c98a79650720f2401a156beabfaf028de9\": rpc error: code = NotFound desc = could not find container \"379e67ab2df011ed3e1a17dd2f3396c98a79650720f2401a156beabfaf028de9\": container with ID starting with 379e67ab2df011ed3e1a17dd2f3396c98a79650720f2401a156beabfaf028de9 not found: ID does not exist" Dec 03 20:08:08.881024 master-0 kubenswrapper[9368]: I1203 20:08:08.876478 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-98lh5"] Dec 03 20:08:08.881024 master-0 kubenswrapper[9368]: W1203 20:08:08.878322 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf5d6b8a_9fd1_4bd3_8b74_2d634caf7db2.slice/crio-b4d8dcd686b7f438d91027e16be00d386ed8e811dad59ae3d10143a981ef3034 WatchSource:0}: Error 
finding container b4d8dcd686b7f438d91027e16be00d386ed8e811dad59ae3d10143a981ef3034: Status 404 returned error can't find the container with id b4d8dcd686b7f438d91027e16be00d386ed8e811dad59ae3d10143a981ef3034 Dec 03 20:08:09.117591 master-0 kubenswrapper[9368]: I1203 20:08:09.115980 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcnrx"] Dec 03 20:08:09.655836 master-0 kubenswrapper[9368]: I1203 20:08:09.655741 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/4.log" Dec 03 20:08:09.656252 master-0 kubenswrapper[9368]: I1203 20:08:09.655901 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" event={"ID":"daa8efc0-4514-4a14-80f5-ab9eca53a127","Type":"ContainerStarted","Data":"6ec56fc4875ae3baa0bcabe3e133ea613a6677191a01ad2df749d3ad1f2b41d9"} Dec 03 20:08:09.660304 master-0 kubenswrapper[9368]: I1203 20:08:09.660248 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"73af3993c24e82ebaeef170be65e78e6baadc7d344c100ebc05ff0759cbf9b83"} Dec 03 20:08:09.662810 master-0 kubenswrapper[9368]: I1203 20:08:09.662721 9368 generic.go:334] "Generic (PLEG): container finished" podID="cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2" containerID="199eaf1616b5ae06926193f7d4e723c00bcb81929b670fb413bd36d7bf6e1d63" exitCode=0 Dec 03 20:08:09.662810 master-0 kubenswrapper[9368]: I1203 20:08:09.662772 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lh5" 
event={"ID":"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2","Type":"ContainerDied","Data":"199eaf1616b5ae06926193f7d4e723c00bcb81929b670fb413bd36d7bf6e1d63"} Dec 03 20:08:09.663063 master-0 kubenswrapper[9368]: I1203 20:08:09.662832 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lh5" event={"ID":"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2","Type":"ContainerStarted","Data":"b4d8dcd686b7f438d91027e16be00d386ed8e811dad59ae3d10143a981ef3034"} Dec 03 20:08:09.666140 master-0 kubenswrapper[9368]: I1203 20:08:09.666103 9368 generic.go:334] "Generic (PLEG): container finished" podID="b638f207-31df-4298-8801-4da6031deefc" containerID="faf3a48e7c674daa85ae24cd3640d8c54a246a72784d7207fb68637d0b2401d5" exitCode=0 Dec 03 20:08:09.666278 master-0 kubenswrapper[9368]: I1203 20:08:09.666160 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcnrx" event={"ID":"b638f207-31df-4298-8801-4da6031deefc","Type":"ContainerDied","Data":"faf3a48e7c674daa85ae24cd3640d8c54a246a72784d7207fb68637d0b2401d5"} Dec 03 20:08:09.666278 master-0 kubenswrapper[9368]: I1203 20:08:09.666178 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcnrx" event={"ID":"b638f207-31df-4298-8801-4da6031deefc","Type":"ContainerStarted","Data":"b0f89725c2a6c3514238a4cc365a81c3b56d37ffea32d9d0a2e9a1e91fecf2fb"} Dec 03 20:08:10.101811 master-0 kubenswrapper[9368]: I1203 20:08:10.101691 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vnt24" Dec 03 20:08:10.138272 master-0 kubenswrapper[9368]: I1203 20:08:10.138210 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vnt24" Dec 03 20:08:10.509867 master-0 kubenswrapper[9368]: I1203 20:08:10.509806 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-sp868"] Dec 03 20:08:10.510507 master-0 kubenswrapper[9368]: I1203 20:08:10.510467 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sp868" podUID="48dfa48e-caea-4017-bd3e-d1da8bcd2da7" containerName="registry-server" containerID="cri-o://63e96daa282dbc7d024e787ccf340beb8400981b7e21cf7891ecde2dd88c97bf" gracePeriod=2 Dec 03 20:08:10.556219 master-0 kubenswrapper[9368]: I1203 20:08:10.556170 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81839b26-cf66-4532-a646-ef4cd5d5e471" path="/var/lib/kubelet/pods/81839b26-cf66-4532-a646-ef4cd5d5e471/volumes" Dec 03 20:08:10.557474 master-0 kubenswrapper[9368]: I1203 20:08:10.557440 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acb1d894-1bc0-478d-87fc-e9137291df70" path="/var/lib/kubelet/pods/acb1d894-1bc0-478d-87fc-e9137291df70/volumes" Dec 03 20:08:10.683204 master-0 kubenswrapper[9368]: I1203 20:08:10.683080 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lh5" event={"ID":"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2","Type":"ContainerStarted","Data":"aa9d0cc86210e7d9335ee33dc0e24caf30866ce853c547c220c347b3bc7052c9"} Dec 03 20:08:10.689093 master-0 kubenswrapper[9368]: I1203 20:08:10.688994 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sp868" event={"ID":"48dfa48e-caea-4017-bd3e-d1da8bcd2da7","Type":"ContainerDied","Data":"63e96daa282dbc7d024e787ccf340beb8400981b7e21cf7891ecde2dd88c97bf"} Dec 03 20:08:10.689463 master-0 kubenswrapper[9368]: I1203 20:08:10.688964 9368 generic.go:334] "Generic (PLEG): container finished" podID="48dfa48e-caea-4017-bd3e-d1da8bcd2da7" containerID="63e96daa282dbc7d024e787ccf340beb8400981b7e21cf7891ecde2dd88c97bf" exitCode=0 Dec 03 20:08:10.707957 master-0 kubenswrapper[9368]: I1203 20:08:10.707727 9368 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6zrxk"] Dec 03 20:08:10.708503 master-0 kubenswrapper[9368]: I1203 20:08:10.708320 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6zrxk" podUID="af6f6483-5ca1-48b7-90b5-b03d460d041a" containerName="registry-server" containerID="cri-o://ee56c77341ce8ae893027db16657f7afbcb30ba88b988af7846ab1aeb69a726e" gracePeriod=2 Dec 03 20:08:10.917467 master-0 kubenswrapper[9368]: I1203 20:08:10.916693 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mg96g"] Dec 03 20:08:10.918549 master-0 kubenswrapper[9368]: I1203 20:08:10.918327 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:08:10.920620 master-0 kubenswrapper[9368]: I1203 20:08:10.920414 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-krxhq" Dec 03 20:08:10.922209 master-0 kubenswrapper[9368]: I1203 20:08:10.922021 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kszjr\" (UniqueName: \"kubernetes.io/projected/6bb19329-c50c-4214-94c8-7e8771b99233-kube-api-access-kszjr\") pod \"certified-operators-mg96g\" (UID: \"6bb19329-c50c-4214-94c8-7e8771b99233\") " pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:08:10.922209 master-0 kubenswrapper[9368]: I1203 20:08:10.922112 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bb19329-c50c-4214-94c8-7e8771b99233-utilities\") pod \"certified-operators-mg96g\" (UID: \"6bb19329-c50c-4214-94c8-7e8771b99233\") " pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:08:10.922423 master-0 kubenswrapper[9368]: I1203 20:08:10.922244 9368 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bb19329-c50c-4214-94c8-7e8771b99233-catalog-content\") pod \"certified-operators-mg96g\" (UID: \"6bb19329-c50c-4214-94c8-7e8771b99233\") " pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:08:10.935728 master-0 kubenswrapper[9368]: I1203 20:08:10.935672 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mg96g"] Dec 03 20:08:11.024244 master-0 kubenswrapper[9368]: I1203 20:08:11.024178 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bb19329-c50c-4214-94c8-7e8771b99233-catalog-content\") pod \"certified-operators-mg96g\" (UID: \"6bb19329-c50c-4214-94c8-7e8771b99233\") " pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:08:11.024446 master-0 kubenswrapper[9368]: I1203 20:08:11.024272 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kszjr\" (UniqueName: \"kubernetes.io/projected/6bb19329-c50c-4214-94c8-7e8771b99233-kube-api-access-kszjr\") pod \"certified-operators-mg96g\" (UID: \"6bb19329-c50c-4214-94c8-7e8771b99233\") " pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:08:11.024446 master-0 kubenswrapper[9368]: I1203 20:08:11.024324 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bb19329-c50c-4214-94c8-7e8771b99233-utilities\") pod \"certified-operators-mg96g\" (UID: \"6bb19329-c50c-4214-94c8-7e8771b99233\") " pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:08:11.025807 master-0 kubenswrapper[9368]: I1203 20:08:11.025747 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6bb19329-c50c-4214-94c8-7e8771b99233-utilities\") pod \"certified-operators-mg96g\" (UID: \"6bb19329-c50c-4214-94c8-7e8771b99233\") " pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:08:11.026300 master-0 kubenswrapper[9368]: I1203 20:08:11.026257 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bb19329-c50c-4214-94c8-7e8771b99233-catalog-content\") pod \"certified-operators-mg96g\" (UID: \"6bb19329-c50c-4214-94c8-7e8771b99233\") " pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:08:11.050839 master-0 kubenswrapper[9368]: I1203 20:08:11.050790 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kszjr\" (UniqueName: \"kubernetes.io/projected/6bb19329-c50c-4214-94c8-7e8771b99233-kube-api-access-kszjr\") pod \"certified-operators-mg96g\" (UID: \"6bb19329-c50c-4214-94c8-7e8771b99233\") " pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:08:11.073404 master-0 kubenswrapper[9368]: I1203 20:08:11.073359 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sp868" Dec 03 20:08:11.085749 master-0 kubenswrapper[9368]: I1203 20:08:11.085717 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:08:11.113018 master-0 kubenswrapper[9368]: I1203 20:08:11.112110 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9smb5"] Dec 03 20:08:11.113018 master-0 kubenswrapper[9368]: E1203 20:08:11.112334 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48dfa48e-caea-4017-bd3e-d1da8bcd2da7" containerName="extract-utilities" Dec 03 20:08:11.113018 master-0 kubenswrapper[9368]: I1203 20:08:11.112346 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="48dfa48e-caea-4017-bd3e-d1da8bcd2da7" containerName="extract-utilities" Dec 03 20:08:11.113018 master-0 kubenswrapper[9368]: E1203 20:08:11.112361 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48dfa48e-caea-4017-bd3e-d1da8bcd2da7" containerName="registry-server" Dec 03 20:08:11.113018 master-0 kubenswrapper[9368]: I1203 20:08:11.112367 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="48dfa48e-caea-4017-bd3e-d1da8bcd2da7" containerName="registry-server" Dec 03 20:08:11.113018 master-0 kubenswrapper[9368]: E1203 20:08:11.112383 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48dfa48e-caea-4017-bd3e-d1da8bcd2da7" containerName="extract-content" Dec 03 20:08:11.113018 master-0 kubenswrapper[9368]: I1203 20:08:11.112389 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="48dfa48e-caea-4017-bd3e-d1da8bcd2da7" containerName="extract-content" Dec 03 20:08:11.113018 master-0 kubenswrapper[9368]: I1203 20:08:11.112489 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="48dfa48e-caea-4017-bd3e-d1da8bcd2da7" containerName="registry-server" Dec 03 20:08:11.114434 master-0 kubenswrapper[9368]: I1203 20:08:11.114358 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:08:11.117466 master-0 kubenswrapper[9368]: I1203 20:08:11.117420 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ngglc" Dec 03 20:08:11.132089 master-0 kubenswrapper[9368]: I1203 20:08:11.132031 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9smb5"] Dec 03 20:08:11.168354 master-0 kubenswrapper[9368]: I1203 20:08:11.168255 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6zrxk" Dec 03 20:08:11.227208 master-0 kubenswrapper[9368]: I1203 20:08:11.227149 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48dfa48e-caea-4017-bd3e-d1da8bcd2da7-utilities\") pod \"48dfa48e-caea-4017-bd3e-d1da8bcd2da7\" (UID: \"48dfa48e-caea-4017-bd3e-d1da8bcd2da7\") " Dec 03 20:08:11.227309 master-0 kubenswrapper[9368]: I1203 20:08:11.227232 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssvzh\" (UniqueName: \"kubernetes.io/projected/48dfa48e-caea-4017-bd3e-d1da8bcd2da7-kube-api-access-ssvzh\") pod \"48dfa48e-caea-4017-bd3e-d1da8bcd2da7\" (UID: \"48dfa48e-caea-4017-bd3e-d1da8bcd2da7\") " Dec 03 20:08:11.227353 master-0 kubenswrapper[9368]: I1203 20:08:11.227326 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48dfa48e-caea-4017-bd3e-d1da8bcd2da7-catalog-content\") pod \"48dfa48e-caea-4017-bd3e-d1da8bcd2da7\" (UID: \"48dfa48e-caea-4017-bd3e-d1da8bcd2da7\") " Dec 03 20:08:11.227645 master-0 kubenswrapper[9368]: I1203 20:08:11.227602 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgmkc\" (UniqueName: 
\"kubernetes.io/projected/a710102c-72fb-4d8d-ad99-71940368a09e-kube-api-access-zgmkc\") pod \"redhat-operators-9smb5\" (UID: \"a710102c-72fb-4d8d-ad99-71940368a09e\") " pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:08:11.227701 master-0 kubenswrapper[9368]: I1203 20:08:11.227670 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a710102c-72fb-4d8d-ad99-71940368a09e-catalog-content\") pod \"redhat-operators-9smb5\" (UID: \"a710102c-72fb-4d8d-ad99-71940368a09e\") " pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:08:11.227794 master-0 kubenswrapper[9368]: I1203 20:08:11.227744 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a710102c-72fb-4d8d-ad99-71940368a09e-utilities\") pod \"redhat-operators-9smb5\" (UID: \"a710102c-72fb-4d8d-ad99-71940368a09e\") " pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:08:11.228666 master-0 kubenswrapper[9368]: I1203 20:08:11.228612 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48dfa48e-caea-4017-bd3e-d1da8bcd2da7-utilities" (OuterVolumeSpecName: "utilities") pod "48dfa48e-caea-4017-bd3e-d1da8bcd2da7" (UID: "48dfa48e-caea-4017-bd3e-d1da8bcd2da7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:08:11.231883 master-0 kubenswrapper[9368]: I1203 20:08:11.231833 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48dfa48e-caea-4017-bd3e-d1da8bcd2da7-kube-api-access-ssvzh" (OuterVolumeSpecName: "kube-api-access-ssvzh") pod "48dfa48e-caea-4017-bd3e-d1da8bcd2da7" (UID: "48dfa48e-caea-4017-bd3e-d1da8bcd2da7"). InnerVolumeSpecName "kube-api-access-ssvzh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:08:11.285254 master-0 kubenswrapper[9368]: I1203 20:08:11.285178 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48dfa48e-caea-4017-bd3e-d1da8bcd2da7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48dfa48e-caea-4017-bd3e-d1da8bcd2da7" (UID: "48dfa48e-caea-4017-bd3e-d1da8bcd2da7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:08:11.328300 master-0 kubenswrapper[9368]: I1203 20:08:11.328251 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af6f6483-5ca1-48b7-90b5-b03d460d041a-utilities\") pod \"af6f6483-5ca1-48b7-90b5-b03d460d041a\" (UID: \"af6f6483-5ca1-48b7-90b5-b03d460d041a\") " Dec 03 20:08:11.328860 master-0 kubenswrapper[9368]: I1203 20:08:11.328333 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af6f6483-5ca1-48b7-90b5-b03d460d041a-catalog-content\") pod \"af6f6483-5ca1-48b7-90b5-b03d460d041a\" (UID: \"af6f6483-5ca1-48b7-90b5-b03d460d041a\") " Dec 03 20:08:11.328860 master-0 kubenswrapper[9368]: I1203 20:08:11.328377 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwtpl\" (UniqueName: \"kubernetes.io/projected/af6f6483-5ca1-48b7-90b5-b03d460d041a-kube-api-access-mwtpl\") pod \"af6f6483-5ca1-48b7-90b5-b03d460d041a\" (UID: \"af6f6483-5ca1-48b7-90b5-b03d460d041a\") " Dec 03 20:08:11.328860 master-0 kubenswrapper[9368]: I1203 20:08:11.328576 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgmkc\" (UniqueName: \"kubernetes.io/projected/a710102c-72fb-4d8d-ad99-71940368a09e-kube-api-access-zgmkc\") pod \"redhat-operators-9smb5\" (UID: \"a710102c-72fb-4d8d-ad99-71940368a09e\") " 
pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:08:11.328860 master-0 kubenswrapper[9368]: I1203 20:08:11.328609 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a710102c-72fb-4d8d-ad99-71940368a09e-catalog-content\") pod \"redhat-operators-9smb5\" (UID: \"a710102c-72fb-4d8d-ad99-71940368a09e\") " pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:08:11.328860 master-0 kubenswrapper[9368]: I1203 20:08:11.328647 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a710102c-72fb-4d8d-ad99-71940368a09e-utilities\") pod \"redhat-operators-9smb5\" (UID: \"a710102c-72fb-4d8d-ad99-71940368a09e\") " pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:08:11.328860 master-0 kubenswrapper[9368]: I1203 20:08:11.328702 9368 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48dfa48e-caea-4017-bd3e-d1da8bcd2da7-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:11.328860 master-0 kubenswrapper[9368]: I1203 20:08:11.328714 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssvzh\" (UniqueName: \"kubernetes.io/projected/48dfa48e-caea-4017-bd3e-d1da8bcd2da7-kube-api-access-ssvzh\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:11.328860 master-0 kubenswrapper[9368]: I1203 20:08:11.328722 9368 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48dfa48e-caea-4017-bd3e-d1da8bcd2da7-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:11.329244 master-0 kubenswrapper[9368]: I1203 20:08:11.329129 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a710102c-72fb-4d8d-ad99-71940368a09e-utilities\") pod \"redhat-operators-9smb5\" (UID: 
\"a710102c-72fb-4d8d-ad99-71940368a09e\") " pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:08:11.331068 master-0 kubenswrapper[9368]: I1203 20:08:11.330116 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a710102c-72fb-4d8d-ad99-71940368a09e-catalog-content\") pod \"redhat-operators-9smb5\" (UID: \"a710102c-72fb-4d8d-ad99-71940368a09e\") " pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:08:11.331326 master-0 kubenswrapper[9368]: I1203 20:08:11.331259 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af6f6483-5ca1-48b7-90b5-b03d460d041a-utilities" (OuterVolumeSpecName: "utilities") pod "af6f6483-5ca1-48b7-90b5-b03d460d041a" (UID: "af6f6483-5ca1-48b7-90b5-b03d460d041a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:08:11.332665 master-0 kubenswrapper[9368]: I1203 20:08:11.332598 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af6f6483-5ca1-48b7-90b5-b03d460d041a-kube-api-access-mwtpl" (OuterVolumeSpecName: "kube-api-access-mwtpl") pod "af6f6483-5ca1-48b7-90b5-b03d460d041a" (UID: "af6f6483-5ca1-48b7-90b5-b03d460d041a"). InnerVolumeSpecName "kube-api-access-mwtpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:08:11.335248 master-0 kubenswrapper[9368]: I1203 20:08:11.335187 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mg96g"] Dec 03 20:08:11.338823 master-0 kubenswrapper[9368]: W1203 20:08:11.338763 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bb19329_c50c_4214_94c8_7e8771b99233.slice/crio-06784de650baea189078ed954c09cf1adab506ac7eaeb2563708127435863bfd WatchSource:0}: Error finding container 06784de650baea189078ed954c09cf1adab506ac7eaeb2563708127435863bfd: Status 404 returned error can't find the container with id 06784de650baea189078ed954c09cf1adab506ac7eaeb2563708127435863bfd Dec 03 20:08:11.350469 master-0 kubenswrapper[9368]: I1203 20:08:11.350411 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgmkc\" (UniqueName: \"kubernetes.io/projected/a710102c-72fb-4d8d-ad99-71940368a09e-kube-api-access-zgmkc\") pod \"redhat-operators-9smb5\" (UID: \"a710102c-72fb-4d8d-ad99-71940368a09e\") " pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:08:11.430606 master-0 kubenswrapper[9368]: I1203 20:08:11.430549 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwtpl\" (UniqueName: \"kubernetes.io/projected/af6f6483-5ca1-48b7-90b5-b03d460d041a-kube-api-access-mwtpl\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:11.430697 master-0 kubenswrapper[9368]: I1203 20:08:11.430611 9368 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af6f6483-5ca1-48b7-90b5-b03d460d041a-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:11.444628 master-0 kubenswrapper[9368]: I1203 20:08:11.444521 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/af6f6483-5ca1-48b7-90b5-b03d460d041a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af6f6483-5ca1-48b7-90b5-b03d460d041a" (UID: "af6f6483-5ca1-48b7-90b5-b03d460d041a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:08:11.447449 master-0 kubenswrapper[9368]: I1203 20:08:11.447388 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:08:11.503692 master-0 kubenswrapper[9368]: I1203 20:08:11.503651 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vnt24"] Dec 03 20:08:11.532024 master-0 kubenswrapper[9368]: I1203 20:08:11.531964 9368 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af6f6483-5ca1-48b7-90b5-b03d460d041a-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:11.700198 master-0 kubenswrapper[9368]: I1203 20:08:11.700074 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sp868" event={"ID":"48dfa48e-caea-4017-bd3e-d1da8bcd2da7","Type":"ContainerDied","Data":"97db26d863bf0ebdc932c5639db85fc3842260317e397675b72f82e6a0ecb736"} Dec 03 20:08:11.700198 master-0 kubenswrapper[9368]: I1203 20:08:11.700132 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sp868" Dec 03 20:08:11.700409 master-0 kubenswrapper[9368]: I1203 20:08:11.700141 9368 scope.go:117] "RemoveContainer" containerID="63e96daa282dbc7d024e787ccf340beb8400981b7e21cf7891ecde2dd88c97bf" Dec 03 20:08:11.702630 master-0 kubenswrapper[9368]: I1203 20:08:11.702573 9368 generic.go:334] "Generic (PLEG): container finished" podID="b638f207-31df-4298-8801-4da6031deefc" containerID="d2c1886a2860f8a9cfc62feb851502428fec91b03a3c1244620b2a342cd94941" exitCode=0 Dec 03 20:08:11.702703 master-0 kubenswrapper[9368]: I1203 20:08:11.702670 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcnrx" event={"ID":"b638f207-31df-4298-8801-4da6031deefc","Type":"ContainerDied","Data":"d2c1886a2860f8a9cfc62feb851502428fec91b03a3c1244620b2a342cd94941"} Dec 03 20:08:11.704810 master-0 kubenswrapper[9368]: I1203 20:08:11.704731 9368 generic.go:334] "Generic (PLEG): container finished" podID="6bb19329-c50c-4214-94c8-7e8771b99233" containerID="13bde77208cb39b575d114d9d173756c7e7bb201950243c772caea7e6104ce2d" exitCode=0 Dec 03 20:08:11.704897 master-0 kubenswrapper[9368]: I1203 20:08:11.704872 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mg96g" event={"ID":"6bb19329-c50c-4214-94c8-7e8771b99233","Type":"ContainerDied","Data":"13bde77208cb39b575d114d9d173756c7e7bb201950243c772caea7e6104ce2d"} Dec 03 20:08:11.704931 master-0 kubenswrapper[9368]: I1203 20:08:11.704905 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mg96g" event={"ID":"6bb19329-c50c-4214-94c8-7e8771b99233","Type":"ContainerStarted","Data":"06784de650baea189078ed954c09cf1adab506ac7eaeb2563708127435863bfd"} Dec 03 20:08:11.707690 master-0 kubenswrapper[9368]: I1203 20:08:11.707656 9368 generic.go:334] "Generic (PLEG): container finished" podID="af6f6483-5ca1-48b7-90b5-b03d460d041a" 
containerID="ee56c77341ce8ae893027db16657f7afbcb30ba88b988af7846ab1aeb69a726e" exitCode=0 Dec 03 20:08:11.707793 master-0 kubenswrapper[9368]: I1203 20:08:11.707709 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zrxk" event={"ID":"af6f6483-5ca1-48b7-90b5-b03d460d041a","Type":"ContainerDied","Data":"ee56c77341ce8ae893027db16657f7afbcb30ba88b988af7846ab1aeb69a726e"} Dec 03 20:08:11.707793 master-0 kubenswrapper[9368]: I1203 20:08:11.707752 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6zrxk" Dec 03 20:08:11.707861 master-0 kubenswrapper[9368]: I1203 20:08:11.707767 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zrxk" event={"ID":"af6f6483-5ca1-48b7-90b5-b03d460d041a","Type":"ContainerDied","Data":"cd824949fd38781abd9d180d17b219a566658c17f085e62ea63db83c4ab6d2d5"} Dec 03 20:08:11.710568 master-0 kubenswrapper[9368]: I1203 20:08:11.710518 9368 generic.go:334] "Generic (PLEG): container finished" podID="cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2" containerID="aa9d0cc86210e7d9335ee33dc0e24caf30866ce853c547c220c347b3bc7052c9" exitCode=0 Dec 03 20:08:11.710654 master-0 kubenswrapper[9368]: I1203 20:08:11.710563 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lh5" event={"ID":"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2","Type":"ContainerDied","Data":"aa9d0cc86210e7d9335ee33dc0e24caf30866ce853c547c220c347b3bc7052c9"} Dec 03 20:08:11.710908 master-0 kubenswrapper[9368]: I1203 20:08:11.710880 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vnt24" podUID="4b507554-0ccf-474e-b674-3546d174419d" containerName="registry-server" containerID="cri-o://bbed2ddc8b6c6909a13c6b5942eac929629ad96f47a0455b844d554893ca04d0" gracePeriod=2 Dec 03 20:08:11.727309 master-0 kubenswrapper[9368]: I1203 
20:08:11.727266 9368 scope.go:117] "RemoveContainer" containerID="1a9171c75fb14718020761f1f71ba22ead121950532b19c76cab72343f2fbba6" Dec 03 20:08:11.747549 master-0 kubenswrapper[9368]: I1203 20:08:11.747489 9368 scope.go:117] "RemoveContainer" containerID="11032c235e99b15636a3920f484a0a8fd50c568319f66ba43948c41f56636e33" Dec 03 20:08:11.781507 master-0 kubenswrapper[9368]: I1203 20:08:11.781422 9368 scope.go:117] "RemoveContainer" containerID="ee56c77341ce8ae893027db16657f7afbcb30ba88b988af7846ab1aeb69a726e" Dec 03 20:08:11.804585 master-0 kubenswrapper[9368]: I1203 20:08:11.804312 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6zrxk"] Dec 03 20:08:11.807444 master-0 kubenswrapper[9368]: I1203 20:08:11.807405 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6zrxk"] Dec 03 20:08:11.823959 master-0 kubenswrapper[9368]: I1203 20:08:11.823849 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sp868"] Dec 03 20:08:11.826191 master-0 kubenswrapper[9368]: I1203 20:08:11.826155 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sp868"] Dec 03 20:08:11.860912 master-0 kubenswrapper[9368]: I1203 20:08:11.860878 9368 scope.go:117] "RemoveContainer" containerID="c2f11c6f26a2101a0f506435a5b72e6155823723906d7b9e852ca9417dd9baf9" Dec 03 20:08:11.880998 master-0 kubenswrapper[9368]: I1203 20:08:11.880612 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9smb5"] Dec 03 20:08:11.895048 master-0 kubenswrapper[9368]: I1203 20:08:11.894964 9368 scope.go:117] "RemoveContainer" containerID="e1947338732792ae52be0e859ad4f708f515c228475b3b078f809c470c16a7d3" Dec 03 20:08:11.899122 master-0 kubenswrapper[9368]: W1203 20:08:11.899092 9368 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda710102c_72fb_4d8d_ad99_71940368a09e.slice/crio-28a18281ec372fa67100c899a3d3b1ddbaac78df588b0cd751eb6a61fdd46f87 WatchSource:0}: Error finding container 28a18281ec372fa67100c899a3d3b1ddbaac78df588b0cd751eb6a61fdd46f87: Status 404 returned error can't find the container with id 28a18281ec372fa67100c899a3d3b1ddbaac78df588b0cd751eb6a61fdd46f87 Dec 03 20:08:11.918451 master-0 kubenswrapper[9368]: I1203 20:08:11.918142 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:08:11.923097 master-0 kubenswrapper[9368]: I1203 20:08:11.923043 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:08:11.929399 master-0 kubenswrapper[9368]: I1203 20:08:11.929324 9368 scope.go:117] "RemoveContainer" containerID="ee56c77341ce8ae893027db16657f7afbcb30ba88b988af7846ab1aeb69a726e" Dec 03 20:08:11.930067 master-0 kubenswrapper[9368]: E1203 20:08:11.930033 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee56c77341ce8ae893027db16657f7afbcb30ba88b988af7846ab1aeb69a726e\": container with ID starting with ee56c77341ce8ae893027db16657f7afbcb30ba88b988af7846ab1aeb69a726e not found: ID does not exist" containerID="ee56c77341ce8ae893027db16657f7afbcb30ba88b988af7846ab1aeb69a726e" Dec 03 20:08:11.930145 master-0 kubenswrapper[9368]: I1203 20:08:11.930076 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee56c77341ce8ae893027db16657f7afbcb30ba88b988af7846ab1aeb69a726e"} err="failed to get container status \"ee56c77341ce8ae893027db16657f7afbcb30ba88b988af7846ab1aeb69a726e\": rpc error: code = NotFound desc = could not find container \"ee56c77341ce8ae893027db16657f7afbcb30ba88b988af7846ab1aeb69a726e\": container with ID 
starting with ee56c77341ce8ae893027db16657f7afbcb30ba88b988af7846ab1aeb69a726e not found: ID does not exist" Dec 03 20:08:11.930145 master-0 kubenswrapper[9368]: I1203 20:08:11.930110 9368 scope.go:117] "RemoveContainer" containerID="c2f11c6f26a2101a0f506435a5b72e6155823723906d7b9e852ca9417dd9baf9" Dec 03 20:08:11.930563 master-0 kubenswrapper[9368]: E1203 20:08:11.930528 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2f11c6f26a2101a0f506435a5b72e6155823723906d7b9e852ca9417dd9baf9\": container with ID starting with c2f11c6f26a2101a0f506435a5b72e6155823723906d7b9e852ca9417dd9baf9 not found: ID does not exist" containerID="c2f11c6f26a2101a0f506435a5b72e6155823723906d7b9e852ca9417dd9baf9" Dec 03 20:08:11.930728 master-0 kubenswrapper[9368]: I1203 20:08:11.930684 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2f11c6f26a2101a0f506435a5b72e6155823723906d7b9e852ca9417dd9baf9"} err="failed to get container status \"c2f11c6f26a2101a0f506435a5b72e6155823723906d7b9e852ca9417dd9baf9\": rpc error: code = NotFound desc = could not find container \"c2f11c6f26a2101a0f506435a5b72e6155823723906d7b9e852ca9417dd9baf9\": container with ID starting with c2f11c6f26a2101a0f506435a5b72e6155823723906d7b9e852ca9417dd9baf9 not found: ID does not exist" Dec 03 20:08:11.930847 master-0 kubenswrapper[9368]: I1203 20:08:11.930830 9368 scope.go:117] "RemoveContainer" containerID="e1947338732792ae52be0e859ad4f708f515c228475b3b078f809c470c16a7d3" Dec 03 20:08:11.931403 master-0 kubenswrapper[9368]: E1203 20:08:11.931367 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1947338732792ae52be0e859ad4f708f515c228475b3b078f809c470c16a7d3\": container with ID starting with e1947338732792ae52be0e859ad4f708f515c228475b3b078f809c470c16a7d3 not found: ID does not exist" 
containerID="e1947338732792ae52be0e859ad4f708f515c228475b3b078f809c470c16a7d3" Dec 03 20:08:11.931515 master-0 kubenswrapper[9368]: I1203 20:08:11.931496 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1947338732792ae52be0e859ad4f708f515c228475b3b078f809c470c16a7d3"} err="failed to get container status \"e1947338732792ae52be0e859ad4f708f515c228475b3b078f809c470c16a7d3\": rpc error: code = NotFound desc = could not find container \"e1947338732792ae52be0e859ad4f708f515c228475b3b078f809c470c16a7d3\": container with ID starting with e1947338732792ae52be0e859ad4f708f515c228475b3b078f809c470c16a7d3 not found: ID does not exist" Dec 03 20:08:12.151144 master-0 kubenswrapper[9368]: I1203 20:08:12.151087 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vnt24" Dec 03 20:08:12.346835 master-0 kubenswrapper[9368]: I1203 20:08:12.346759 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b507554-0ccf-474e-b674-3546d174419d-catalog-content\") pod \"4b507554-0ccf-474e-b674-3546d174419d\" (UID: \"4b507554-0ccf-474e-b674-3546d174419d\") " Dec 03 20:08:12.346994 master-0 kubenswrapper[9368]: I1203 20:08:12.346837 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b507554-0ccf-474e-b674-3546d174419d-utilities\") pod \"4b507554-0ccf-474e-b674-3546d174419d\" (UID: \"4b507554-0ccf-474e-b674-3546d174419d\") " Dec 03 20:08:12.346994 master-0 kubenswrapper[9368]: I1203 20:08:12.346918 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25kbk\" (UniqueName: \"kubernetes.io/projected/4b507554-0ccf-474e-b674-3546d174419d-kube-api-access-25kbk\") pod \"4b507554-0ccf-474e-b674-3546d174419d\" (UID: \"4b507554-0ccf-474e-b674-3546d174419d\") " Dec 
03 20:08:12.348344 master-0 kubenswrapper[9368]: I1203 20:08:12.348050 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b507554-0ccf-474e-b674-3546d174419d-utilities" (OuterVolumeSpecName: "utilities") pod "4b507554-0ccf-474e-b674-3546d174419d" (UID: "4b507554-0ccf-474e-b674-3546d174419d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:08:12.353669 master-0 kubenswrapper[9368]: I1203 20:08:12.351303 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b507554-0ccf-474e-b674-3546d174419d-kube-api-access-25kbk" (OuterVolumeSpecName: "kube-api-access-25kbk") pod "4b507554-0ccf-474e-b674-3546d174419d" (UID: "4b507554-0ccf-474e-b674-3546d174419d"). InnerVolumeSpecName "kube-api-access-25kbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:08:12.448272 master-0 kubenswrapper[9368]: I1203 20:08:12.448191 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25kbk\" (UniqueName: \"kubernetes.io/projected/4b507554-0ccf-474e-b674-3546d174419d-kube-api-access-25kbk\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:12.448272 master-0 kubenswrapper[9368]: I1203 20:08:12.448228 9368 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b507554-0ccf-474e-b674-3546d174419d-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:12.475514 master-0 kubenswrapper[9368]: I1203 20:08:12.475379 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b507554-0ccf-474e-b674-3546d174419d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b507554-0ccf-474e-b674-3546d174419d" (UID: "4b507554-0ccf-474e-b674-3546d174419d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:08:12.544493 master-0 kubenswrapper[9368]: I1203 20:08:12.544436 9368 scope.go:117] "RemoveContainer" containerID="0b22734703d42f07c436963e348c3be11ab4f5053e6afed5996abb0dab7d690d" Dec 03 20:08:12.544734 master-0 kubenswrapper[9368]: I1203 20:08:12.544516 9368 scope.go:117] "RemoveContainer" containerID="64faeeb7a4647a9e5dd702400fe60f14013f02b00360bb310c4d37859f33d70c" Dec 03 20:08:12.544836 master-0 kubenswrapper[9368]: E1203 20:08:12.544730 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=authentication-operator pod=authentication-operator-7479ffdf48-mfwhz_openshift-authentication-operator(a185ee17-4b4b-4d20-a8ed-56a2a01f1807)\"" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podUID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" Dec 03 20:08:12.557319 master-0 kubenswrapper[9368]: I1203 20:08:12.557271 9368 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b507554-0ccf-474e-b674-3546d174419d-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 20:08:12.567619 master-0 kubenswrapper[9368]: I1203 20:08:12.567560 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48dfa48e-caea-4017-bd3e-d1da8bcd2da7" path="/var/lib/kubelet/pods/48dfa48e-caea-4017-bd3e-d1da8bcd2da7/volumes" Dec 03 20:08:12.569479 master-0 kubenswrapper[9368]: I1203 20:08:12.569440 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af6f6483-5ca1-48b7-90b5-b03d460d041a" path="/var/lib/kubelet/pods/af6f6483-5ca1-48b7-90b5-b03d460d041a/volumes" Dec 03 20:08:12.726316 master-0 kubenswrapper[9368]: I1203 20:08:12.726158 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcnrx" 
event={"ID":"b638f207-31df-4298-8801-4da6031deefc","Type":"ContainerStarted","Data":"78ee178f7a003a7ea4da8e7af4ab70482ff1af0db0ba75e1e287f26f2dba23e1"} Dec 03 20:08:12.729138 master-0 kubenswrapper[9368]: I1203 20:08:12.729081 9368 generic.go:334] "Generic (PLEG): container finished" podID="4b507554-0ccf-474e-b674-3546d174419d" containerID="bbed2ddc8b6c6909a13c6b5942eac929629ad96f47a0455b844d554893ca04d0" exitCode=0 Dec 03 20:08:12.729346 master-0 kubenswrapper[9368]: I1203 20:08:12.729162 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnt24" event={"ID":"4b507554-0ccf-474e-b674-3546d174419d","Type":"ContainerDied","Data":"bbed2ddc8b6c6909a13c6b5942eac929629ad96f47a0455b844d554893ca04d0"} Dec 03 20:08:12.729346 master-0 kubenswrapper[9368]: I1203 20:08:12.729184 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnt24" event={"ID":"4b507554-0ccf-474e-b674-3546d174419d","Type":"ContainerDied","Data":"666ad51f38944e2835a47dc2eef12cdc4e1a1f7de33cf39c376607a4fac3d13f"} Dec 03 20:08:12.729346 master-0 kubenswrapper[9368]: I1203 20:08:12.729216 9368 scope.go:117] "RemoveContainer" containerID="bbed2ddc8b6c6909a13c6b5942eac929629ad96f47a0455b844d554893ca04d0" Dec 03 20:08:12.729346 master-0 kubenswrapper[9368]: I1203 20:08:12.729226 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vnt24" Dec 03 20:08:12.731067 master-0 kubenswrapper[9368]: I1203 20:08:12.731007 9368 generic.go:334] "Generic (PLEG): container finished" podID="a710102c-72fb-4d8d-ad99-71940368a09e" containerID="90e536e37d10c97618a40c363b7fe1c09180dc7a8bef1b5767ffc36ddc8dad7f" exitCode=0 Dec 03 20:08:12.731210 master-0 kubenswrapper[9368]: I1203 20:08:12.731085 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9smb5" event={"ID":"a710102c-72fb-4d8d-ad99-71940368a09e","Type":"ContainerDied","Data":"90e536e37d10c97618a40c363b7fe1c09180dc7a8bef1b5767ffc36ddc8dad7f"} Dec 03 20:08:12.731210 master-0 kubenswrapper[9368]: I1203 20:08:12.731115 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9smb5" event={"ID":"a710102c-72fb-4d8d-ad99-71940368a09e","Type":"ContainerStarted","Data":"28a18281ec372fa67100c899a3d3b1ddbaac78df588b0cd751eb6a61fdd46f87"} Dec 03 20:08:12.732691 master-0 kubenswrapper[9368]: I1203 20:08:12.732652 9368 generic.go:334] "Generic (PLEG): container finished" podID="6bb19329-c50c-4214-94c8-7e8771b99233" containerID="58ad9d8d299c84cf4870b5819091b740262ae4d0d8ffa65ef713656d5a0160a8" exitCode=0 Dec 03 20:08:12.732886 master-0 kubenswrapper[9368]: I1203 20:08:12.732704 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mg96g" event={"ID":"6bb19329-c50c-4214-94c8-7e8771b99233","Type":"ContainerDied","Data":"58ad9d8d299c84cf4870b5819091b740262ae4d0d8ffa65ef713656d5a0160a8"} Dec 03 20:08:12.743944 master-0 kubenswrapper[9368]: I1203 20:08:12.743872 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98lh5" event={"ID":"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2","Type":"ContainerStarted","Data":"f6bfc08bb3cc90b54b4faf3a9c1d638cbb1f26ebec5cee39491ca7ba47115ce4"} Dec 03 20:08:12.753118 master-0 kubenswrapper[9368]: 
I1203 20:08:12.752434 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:08:12.753118 master-0 kubenswrapper[9368]: I1203 20:08:12.752993 9368 scope.go:117] "RemoveContainer" containerID="36666af06ebcdb898153a85528e3078d2307f8942e516a3be9b07eca917081ee" Dec 03 20:08:12.754362 master-0 kubenswrapper[9368]: I1203 20:08:12.754165 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wcnrx" podStartSLOduration=2.2483689350000002 podStartE2EDuration="4.754133635s" podCreationTimestamp="2025-12-03 20:08:08 +0000 UTC" firstStartedPulling="2025-12-03 20:08:09.667552195 +0000 UTC m=+755.328802146" lastFinishedPulling="2025-12-03 20:08:12.173316915 +0000 UTC m=+757.834566846" observedRunningTime="2025-12-03 20:08:12.753327117 +0000 UTC m=+758.414577028" watchObservedRunningTime="2025-12-03 20:08:12.754133635 +0000 UTC m=+758.415383576" Dec 03 20:08:12.790548 master-0 kubenswrapper[9368]: I1203 20:08:12.789597 9368 scope.go:117] "RemoveContainer" containerID="bb593d1209c849fffed27a1d8bb320795bc79053be60df59d7acbd932edfd4d0" Dec 03 20:08:12.806449 master-0 kubenswrapper[9368]: I1203 20:08:12.806337 9368 scope.go:117] "RemoveContainer" containerID="bbed2ddc8b6c6909a13c6b5942eac929629ad96f47a0455b844d554893ca04d0" Dec 03 20:08:12.807925 master-0 kubenswrapper[9368]: E1203 20:08:12.807770 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbed2ddc8b6c6909a13c6b5942eac929629ad96f47a0455b844d554893ca04d0\": container with ID starting with bbed2ddc8b6c6909a13c6b5942eac929629ad96f47a0455b844d554893ca04d0 not found: ID does not exist" containerID="bbed2ddc8b6c6909a13c6b5942eac929629ad96f47a0455b844d554893ca04d0" Dec 03 20:08:12.808091 master-0 kubenswrapper[9368]: I1203 20:08:12.807878 9368 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bbed2ddc8b6c6909a13c6b5942eac929629ad96f47a0455b844d554893ca04d0"} err="failed to get container status \"bbed2ddc8b6c6909a13c6b5942eac929629ad96f47a0455b844d554893ca04d0\": rpc error: code = NotFound desc = could not find container \"bbed2ddc8b6c6909a13c6b5942eac929629ad96f47a0455b844d554893ca04d0\": container with ID starting with bbed2ddc8b6c6909a13c6b5942eac929629ad96f47a0455b844d554893ca04d0 not found: ID does not exist" Dec 03 20:08:12.808091 master-0 kubenswrapper[9368]: I1203 20:08:12.808022 9368 scope.go:117] "RemoveContainer" containerID="36666af06ebcdb898153a85528e3078d2307f8942e516a3be9b07eca917081ee" Dec 03 20:08:12.808623 master-0 kubenswrapper[9368]: E1203 20:08:12.808580 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36666af06ebcdb898153a85528e3078d2307f8942e516a3be9b07eca917081ee\": container with ID starting with 36666af06ebcdb898153a85528e3078d2307f8942e516a3be9b07eca917081ee not found: ID does not exist" containerID="36666af06ebcdb898153a85528e3078d2307f8942e516a3be9b07eca917081ee" Dec 03 20:08:12.808706 master-0 kubenswrapper[9368]: I1203 20:08:12.808632 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36666af06ebcdb898153a85528e3078d2307f8942e516a3be9b07eca917081ee"} err="failed to get container status \"36666af06ebcdb898153a85528e3078d2307f8942e516a3be9b07eca917081ee\": rpc error: code = NotFound desc = could not find container \"36666af06ebcdb898153a85528e3078d2307f8942e516a3be9b07eca917081ee\": container with ID starting with 36666af06ebcdb898153a85528e3078d2307f8942e516a3be9b07eca917081ee not found: ID does not exist" Dec 03 20:08:12.808706 master-0 kubenswrapper[9368]: I1203 20:08:12.808666 9368 scope.go:117] "RemoveContainer" containerID="bb593d1209c849fffed27a1d8bb320795bc79053be60df59d7acbd932edfd4d0" Dec 03 20:08:12.808957 master-0 kubenswrapper[9368]: E1203 
20:08:12.808930 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb593d1209c849fffed27a1d8bb320795bc79053be60df59d7acbd932edfd4d0\": container with ID starting with bb593d1209c849fffed27a1d8bb320795bc79053be60df59d7acbd932edfd4d0 not found: ID does not exist" containerID="bb593d1209c849fffed27a1d8bb320795bc79053be60df59d7acbd932edfd4d0" Dec 03 20:08:12.809042 master-0 kubenswrapper[9368]: I1203 20:08:12.808958 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb593d1209c849fffed27a1d8bb320795bc79053be60df59d7acbd932edfd4d0"} err="failed to get container status \"bb593d1209c849fffed27a1d8bb320795bc79053be60df59d7acbd932edfd4d0\": rpc error: code = NotFound desc = could not find container \"bb593d1209c849fffed27a1d8bb320795bc79053be60df59d7acbd932edfd4d0\": container with ID starting with bb593d1209c849fffed27a1d8bb320795bc79053be60df59d7acbd932edfd4d0 not found: ID does not exist" Dec 03 20:08:12.809091 master-0 kubenswrapper[9368]: I1203 20:08:12.809018 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-98lh5" podStartSLOduration=2.347538862 podStartE2EDuration="4.809003195s" podCreationTimestamp="2025-12-03 20:08:08 +0000 UTC" firstStartedPulling="2025-12-03 20:08:09.664983636 +0000 UTC m=+755.326233577" lastFinishedPulling="2025-12-03 20:08:12.126447989 +0000 UTC m=+757.787697910" observedRunningTime="2025-12-03 20:08:12.805988925 +0000 UTC m=+758.467238846" watchObservedRunningTime="2025-12-03 20:08:12.809003195 +0000 UTC m=+758.470253126" Dec 03 20:08:12.827664 master-0 kubenswrapper[9368]: I1203 20:08:12.827597 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vnt24"] Dec 03 20:08:12.830519 master-0 kubenswrapper[9368]: I1203 20:08:12.830477 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-vnt24"] Dec 03 20:08:13.761942 master-0 kubenswrapper[9368]: I1203 20:08:13.761760 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mg96g" event={"ID":"6bb19329-c50c-4214-94c8-7e8771b99233","Type":"ContainerStarted","Data":"9812bd4619e32237fc95f02e392f53708fc733215336153143b3521252c1a882"} Dec 03 20:08:13.766551 master-0 kubenswrapper[9368]: I1203 20:08:13.765406 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/4.log" Dec 03 20:08:13.766551 master-0 kubenswrapper[9368]: I1203 20:08:13.765557 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" event={"ID":"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3","Type":"ContainerStarted","Data":"714e1c5f497c698e9ca899d786aa0c0f0f3b92ae6d1c4728cb0e04fac8e3dc32"} Dec 03 20:08:13.772843 master-0 kubenswrapper[9368]: I1203 20:08:13.772711 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9smb5" event={"ID":"a710102c-72fb-4d8d-ad99-71940368a09e","Type":"ContainerStarted","Data":"c62ca34d47648391f608e6f2fa80f298167bf660cde830ab9846e95ff4484b7f"} Dec 03 20:08:13.795043 master-0 kubenswrapper[9368]: I1203 20:08:13.794947 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mg96g" podStartSLOduration=2.034501919 podStartE2EDuration="3.794923183s" podCreationTimestamp="2025-12-03 20:08:10 +0000 UTC" firstStartedPulling="2025-12-03 20:08:11.706398418 +0000 UTC m=+757.367648359" lastFinishedPulling="2025-12-03 20:08:13.466819702 +0000 UTC m=+759.128069623" observedRunningTime="2025-12-03 20:08:13.791579035 +0000 UTC m=+759.452828976" 
watchObservedRunningTime="2025-12-03 20:08:13.794923183 +0000 UTC m=+759.456173104" Dec 03 20:08:14.557856 master-0 kubenswrapper[9368]: I1203 20:08:14.557613 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b507554-0ccf-474e-b674-3546d174419d" path="/var/lib/kubelet/pods/4b507554-0ccf-474e-b674-3546d174419d/volumes" Dec 03 20:08:14.784811 master-0 kubenswrapper[9368]: I1203 20:08:14.784695 9368 generic.go:334] "Generic (PLEG): container finished" podID="a710102c-72fb-4d8d-ad99-71940368a09e" containerID="c62ca34d47648391f608e6f2fa80f298167bf660cde830ab9846e95ff4484b7f" exitCode=0 Dec 03 20:08:14.785596 master-0 kubenswrapper[9368]: I1203 20:08:14.785062 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9smb5" event={"ID":"a710102c-72fb-4d8d-ad99-71940368a09e","Type":"ContainerDied","Data":"c62ca34d47648391f608e6f2fa80f298167bf660cde830ab9846e95ff4484b7f"} Dec 03 20:08:15.544114 master-0 kubenswrapper[9368]: I1203 20:08:15.543984 9368 scope.go:117] "RemoveContainer" containerID="db25cf44f0675c418850d8d41463efcb1765ff94722958664210b9165ac00ff3" Dec 03 20:08:15.544270 master-0 kubenswrapper[9368]: I1203 20:08:15.544213 9368 scope.go:117] "RemoveContainer" containerID="c2730eaef31938f9b283223c81622c1d4bbc549630ded57fc1762a2568d60b23" Dec 03 20:08:15.797191 master-0 kubenswrapper[9368]: I1203 20:08:15.797094 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/4.log" Dec 03 20:08:15.797624 master-0 kubenswrapper[9368]: I1203 20:08:15.797225 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" event={"ID":"01d51d9a-9beb-4357-9dc2-aeac210cd0c4","Type":"ContainerStarted","Data":"af893f16ed5d4d40977e9a39daa52a6d5b7f8c4d20d0d485f7de0b422499040e"} Dec 03 20:08:15.801620 master-0 
kubenswrapper[9368]: I1203 20:08:15.801168 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/4.log" Dec 03 20:08:15.801620 master-0 kubenswrapper[9368]: I1203 20:08:15.801340 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" event={"ID":"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f","Type":"ContainerStarted","Data":"7553b6d3497e2495cea97d2bf9f5c9c1cb8a7f933b06dd435f9dab34de0aeebd"} Dec 03 20:08:15.810240 master-0 kubenswrapper[9368]: I1203 20:08:15.810179 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9smb5" event={"ID":"a710102c-72fb-4d8d-ad99-71940368a09e","Type":"ContainerStarted","Data":"344eb085f94f4cf32b17a8de1433f7098f011bd17d6561805e4cce3f3072139f"} Dec 03 20:08:15.881439 master-0 kubenswrapper[9368]: I1203 20:08:15.881294 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9smb5" podStartSLOduration=2.435865782 podStartE2EDuration="4.881269667s" podCreationTimestamp="2025-12-03 20:08:11 +0000 UTC" firstStartedPulling="2025-12-03 20:08:12.732674533 +0000 UTC m=+758.393924484" lastFinishedPulling="2025-12-03 20:08:15.178078458 +0000 UTC m=+760.839328369" observedRunningTime="2025-12-03 20:08:15.858191276 +0000 UTC m=+761.519441227" watchObservedRunningTime="2025-12-03 20:08:15.881269667 +0000 UTC m=+761.542519578" Dec 03 20:08:17.543926 master-0 kubenswrapper[9368]: I1203 20:08:17.543861 9368 scope.go:117] "RemoveContainer" containerID="728aa51e420a0e8c358ef69d6ddcb175d50c7be37aab4f4fdfde93a0791a7b8e" Dec 03 20:08:17.544492 master-0 kubenswrapper[9368]: I1203 20:08:17.543941 9368 scope.go:117] "RemoveContainer" containerID="89ed390af07eecb0f2a6fd24fe986b57e8e8f83dbf2ff2202963967a2fcc7b5e" Dec 03 20:08:18.452279 
master-0 kubenswrapper[9368]: I1203 20:08:18.452161 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:08:18.452279 master-0 kubenswrapper[9368]: I1203 20:08:18.452240 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:08:18.522073 master-0 kubenswrapper[9368]: I1203 20:08:18.522027 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:08:18.544903 master-0 kubenswrapper[9368]: I1203 20:08:18.544845 9368 scope.go:117] "RemoveContainer" containerID="abf1acea0f13046f42e18d29f9f01a5591776e77d3e8cc4b525da74b968fc06b" Dec 03 20:08:18.614869 master-0 kubenswrapper[9368]: I1203 20:08:18.614756 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:08:18.643352 master-0 kubenswrapper[9368]: I1203 20:08:18.643306 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:08:18.643352 master-0 kubenswrapper[9368]: I1203 20:08:18.643347 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:08:18.681443 master-0 kubenswrapper[9368]: I1203 20:08:18.681397 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:08:18.835588 master-0 kubenswrapper[9368]: I1203 20:08:18.835499 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/4.log" Dec 03 20:08:18.835936 master-0 kubenswrapper[9368]: I1203 20:08:18.835615 9368 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" event={"ID":"11e2c94f-f9e9-415b-a550-3006a4632ba4","Type":"ContainerStarted","Data":"0be999ad75aff73c6192f6fbcb1d640ad9e8d758635098b612ae73c57a9495e3"} Dec 03 20:08:18.838018 master-0 kubenswrapper[9368]: I1203 20:08:18.837978 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/4.log" Dec 03 20:08:18.838229 master-0 kubenswrapper[9368]: I1203 20:08:18.838050 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" event={"ID":"6eb4700c-6af0-468b-afc8-1e09b902d6bf","Type":"ContainerStarted","Data":"934c2255e90f67747611997e95b034e1b62239ae3a26430463be115a4447a11b"} Dec 03 20:08:18.883823 master-0 kubenswrapper[9368]: I1203 20:08:18.883734 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:08:18.888609 master-0 kubenswrapper[9368]: I1203 20:08:18.888569 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:08:19.544191 master-0 kubenswrapper[9368]: I1203 20:08:19.544068 9368 scope.go:117] "RemoveContainer" containerID="30e0205b9f3aae7684b5e5dd37ee0470857f4a7020b8a45ab64071c7372511a7" Dec 03 20:08:19.544344 master-0 kubenswrapper[9368]: E1203 20:08:19.544253 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-8xmrv_openshift-config-operator(0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9)\"" 
pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" Dec 03 20:08:19.844921 master-0 kubenswrapper[9368]: I1203 20:08:19.844761 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/4.log" Dec 03 20:08:19.845405 master-0 kubenswrapper[9368]: I1203 20:08:19.844898 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" event={"ID":"943feb0d-7d31-446a-9100-dfc4ef013d12","Type":"ContainerStarted","Data":"30d0da03ee244895af8553014d4d31af80b6ad1caf25d5169adc2c0bc47b666d"} Dec 03 20:08:21.086372 master-0 kubenswrapper[9368]: I1203 20:08:21.086268 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:08:21.086372 master-0 kubenswrapper[9368]: I1203 20:08:21.086372 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:08:21.149002 master-0 kubenswrapper[9368]: I1203 20:08:21.148940 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:08:21.447911 master-0 kubenswrapper[9368]: I1203 20:08:21.447849 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:08:21.447911 master-0 kubenswrapper[9368]: I1203 20:08:21.447910 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:08:21.894213 master-0 kubenswrapper[9368]: I1203 20:08:21.894174 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:08:22.503661 
master-0 kubenswrapper[9368]: I1203 20:08:22.503606 9368 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9smb5" podUID="a710102c-72fb-4d8d-ad99-71940368a09e" containerName="registry-server" probeResult="failure" output=< Dec 03 20:08:22.503661 master-0 kubenswrapper[9368]: timeout: failed to connect service ":50051" within 1s Dec 03 20:08:22.503661 master-0 kubenswrapper[9368]: > Dec 03 20:08:26.544139 master-0 kubenswrapper[9368]: I1203 20:08:26.544076 9368 scope.go:117] "RemoveContainer" containerID="0b22734703d42f07c436963e348c3be11ab4f5053e6afed5996abb0dab7d690d" Dec 03 20:08:26.544924 master-0 kubenswrapper[9368]: E1203 20:08:26.544412 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=authentication-operator pod=authentication-operator-7479ffdf48-mfwhz_openshift-authentication-operator(a185ee17-4b4b-4d20-a8ed-56a2a01f1807)\"" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podUID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" Dec 03 20:08:27.338266 master-0 kubenswrapper[9368]: I1203 20:08:27.338175 9368 patch_prober.go:28] interesting pod/machine-config-daemon-7t8bs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 20:08:27.338516 master-0 kubenswrapper[9368]: I1203 20:08:27.338272 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 20:08:28.522947 master-0 kubenswrapper[9368]: I1203 
20:08:28.522853 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc"] Dec 03 20:08:28.523969 master-0 kubenswrapper[9368]: E1203 20:08:28.523170 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b507554-0ccf-474e-b674-3546d174419d" containerName="extract-utilities" Dec 03 20:08:28.523969 master-0 kubenswrapper[9368]: I1203 20:08:28.523191 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b507554-0ccf-474e-b674-3546d174419d" containerName="extract-utilities" Dec 03 20:08:28.523969 master-0 kubenswrapper[9368]: E1203 20:08:28.523213 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af6f6483-5ca1-48b7-90b5-b03d460d041a" containerName="registry-server" Dec 03 20:08:28.523969 master-0 kubenswrapper[9368]: I1203 20:08:28.523221 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6f6483-5ca1-48b7-90b5-b03d460d041a" containerName="registry-server" Dec 03 20:08:28.523969 master-0 kubenswrapper[9368]: E1203 20:08:28.523236 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b507554-0ccf-474e-b674-3546d174419d" containerName="extract-content" Dec 03 20:08:28.523969 master-0 kubenswrapper[9368]: I1203 20:08:28.523244 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b507554-0ccf-474e-b674-3546d174419d" containerName="extract-content" Dec 03 20:08:28.523969 master-0 kubenswrapper[9368]: E1203 20:08:28.523257 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b507554-0ccf-474e-b674-3546d174419d" containerName="registry-server" Dec 03 20:08:28.523969 master-0 kubenswrapper[9368]: I1203 20:08:28.523265 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b507554-0ccf-474e-b674-3546d174419d" containerName="registry-server" Dec 03 20:08:28.523969 master-0 kubenswrapper[9368]: E1203 20:08:28.523278 9368 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="af6f6483-5ca1-48b7-90b5-b03d460d041a" containerName="extract-utilities" Dec 03 20:08:28.523969 master-0 kubenswrapper[9368]: I1203 20:08:28.523286 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6f6483-5ca1-48b7-90b5-b03d460d041a" containerName="extract-utilities" Dec 03 20:08:28.523969 master-0 kubenswrapper[9368]: E1203 20:08:28.523300 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af6f6483-5ca1-48b7-90b5-b03d460d041a" containerName="extract-content" Dec 03 20:08:28.523969 master-0 kubenswrapper[9368]: I1203 20:08:28.523308 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6f6483-5ca1-48b7-90b5-b03d460d041a" containerName="extract-content" Dec 03 20:08:28.523969 master-0 kubenswrapper[9368]: I1203 20:08:28.523438 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="af6f6483-5ca1-48b7-90b5-b03d460d041a" containerName="registry-server" Dec 03 20:08:28.523969 master-0 kubenswrapper[9368]: I1203 20:08:28.523477 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b507554-0ccf-474e-b674-3546d174419d" containerName="registry-server" Dec 03 20:08:28.525125 master-0 kubenswrapper[9368]: I1203 20:08:28.524171 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc" Dec 03 20:08:28.525773 master-0 kubenswrapper[9368]: I1203 20:08:28.525702 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-ztlqb" Dec 03 20:08:28.541581 master-0 kubenswrapper[9368]: I1203 20:08:28.541526 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc"] Dec 03 20:08:28.686357 master-0 kubenswrapper[9368]: I1203 20:08:28.686276 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljsr6\" (UniqueName: \"kubernetes.io/projected/c3afc439-ccaa-4751-95a1-ac7557e326f0-kube-api-access-ljsr6\") pod \"multus-admission-controller-5bdcc987c4-s6wpc\" (UID: \"c3afc439-ccaa-4751-95a1-ac7557e326f0\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc" Dec 03 20:08:28.686582 master-0 kubenswrapper[9368]: I1203 20:08:28.686375 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c3afc439-ccaa-4751-95a1-ac7557e326f0-webhook-certs\") pod \"multus-admission-controller-5bdcc987c4-s6wpc\" (UID: \"c3afc439-ccaa-4751-95a1-ac7557e326f0\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc" Dec 03 20:08:28.788464 master-0 kubenswrapper[9368]: I1203 20:08:28.788316 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljsr6\" (UniqueName: \"kubernetes.io/projected/c3afc439-ccaa-4751-95a1-ac7557e326f0-kube-api-access-ljsr6\") pod \"multus-admission-controller-5bdcc987c4-s6wpc\" (UID: \"c3afc439-ccaa-4751-95a1-ac7557e326f0\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc" Dec 03 20:08:28.788464 master-0 kubenswrapper[9368]: I1203 20:08:28.788413 9368 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c3afc439-ccaa-4751-95a1-ac7557e326f0-webhook-certs\") pod \"multus-admission-controller-5bdcc987c4-s6wpc\" (UID: \"c3afc439-ccaa-4751-95a1-ac7557e326f0\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc" Dec 03 20:08:28.792037 master-0 kubenswrapper[9368]: I1203 20:08:28.791962 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c3afc439-ccaa-4751-95a1-ac7557e326f0-webhook-certs\") pod \"multus-admission-controller-5bdcc987c4-s6wpc\" (UID: \"c3afc439-ccaa-4751-95a1-ac7557e326f0\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc" Dec 03 20:08:28.808630 master-0 kubenswrapper[9368]: I1203 20:08:28.808582 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljsr6\" (UniqueName: \"kubernetes.io/projected/c3afc439-ccaa-4751-95a1-ac7557e326f0-kube-api-access-ljsr6\") pod \"multus-admission-controller-5bdcc987c4-s6wpc\" (UID: \"c3afc439-ccaa-4751-95a1-ac7557e326f0\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc" Dec 03 20:08:28.852189 master-0 kubenswrapper[9368]: I1203 20:08:28.852106 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc" Dec 03 20:08:29.287444 master-0 kubenswrapper[9368]: I1203 20:08:29.287383 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc"] Dec 03 20:08:29.299445 master-0 kubenswrapper[9368]: W1203 20:08:29.299405 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3afc439_ccaa_4751_95a1_ac7557e326f0.slice/crio-2e5b6b4913ad9a7e9beefb1308e65939d7d65885f92832939f4bd387eda50473 WatchSource:0}: Error finding container 2e5b6b4913ad9a7e9beefb1308e65939d7d65885f92832939f4bd387eda50473: Status 404 returned error can't find the container with id 2e5b6b4913ad9a7e9beefb1308e65939d7d65885f92832939f4bd387eda50473 Dec 03 20:08:29.913942 master-0 kubenswrapper[9368]: I1203 20:08:29.913893 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc" event={"ID":"c3afc439-ccaa-4751-95a1-ac7557e326f0","Type":"ContainerStarted","Data":"48f71d17f475204e5dab036d54287926e93c63e47aaa99f95313c26c12445d94"} Dec 03 20:08:29.913942 master-0 kubenswrapper[9368]: I1203 20:08:29.913942 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc" event={"ID":"c3afc439-ccaa-4751-95a1-ac7557e326f0","Type":"ContainerStarted","Data":"2e5b6b4913ad9a7e9beefb1308e65939d7d65885f92832939f4bd387eda50473"} Dec 03 20:08:30.927588 master-0 kubenswrapper[9368]: I1203 20:08:30.927488 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc" event={"ID":"c3afc439-ccaa-4751-95a1-ac7557e326f0","Type":"ContainerStarted","Data":"f4ac2359b83873d731069d2cdb007d5b0f4807bb88b794b924d440726fcb9ec5"} Dec 03 20:08:30.956756 master-0 kubenswrapper[9368]: I1203 20:08:30.956653 9368 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc" podStartSLOduration=2.956628803 podStartE2EDuration="2.956628803s" podCreationTimestamp="2025-12-03 20:08:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:08:30.953951302 +0000 UTC m=+776.615201283" watchObservedRunningTime="2025-12-03 20:08:30.956628803 +0000 UTC m=+776.617878744" Dec 03 20:08:31.008555 master-0 kubenswrapper[9368]: I1203 20:08:31.008458 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j"] Dec 03 20:08:31.008990 master-0 kubenswrapper[9368]: I1203 20:08:31.008917 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" podUID="b4316c8d-a1d3-4e51-83cc-d0eecb809924" containerName="multus-admission-controller" containerID="cri-o://ccaa5bcc074786e1602c431f92bcbfc1662e1c5b23f45ded5617110476671e11" gracePeriod=30 Dec 03 20:08:31.009607 master-0 kubenswrapper[9368]: I1203 20:08:31.009535 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" podUID="b4316c8d-a1d3-4e51-83cc-d0eecb809924" containerName="kube-rbac-proxy" containerID="cri-o://77dceba290fd067cd611c6d2a5e4c623247f11076c1771bf8dc8e4af20aaef57" gracePeriod=30 Dec 03 20:08:31.520066 master-0 kubenswrapper[9368]: I1203 20:08:31.519989 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:08:31.584428 master-0 kubenswrapper[9368]: I1203 20:08:31.584343 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:08:31.940540 master-0 kubenswrapper[9368]: I1203 
20:08:31.940444 9368 generic.go:334] "Generic (PLEG): container finished" podID="b4316c8d-a1d3-4e51-83cc-d0eecb809924" containerID="77dceba290fd067cd611c6d2a5e4c623247f11076c1771bf8dc8e4af20aaef57" exitCode=0 Dec 03 20:08:31.941504 master-0 kubenswrapper[9368]: I1203 20:08:31.940535 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" event={"ID":"b4316c8d-a1d3-4e51-83cc-d0eecb809924","Type":"ContainerDied","Data":"77dceba290fd067cd611c6d2a5e4c623247f11076c1771bf8dc8e4af20aaef57"} Dec 03 20:08:32.025210 master-0 kubenswrapper[9368]: I1203 20:08:32.025071 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-retry-1-master-0"] Dec 03 20:08:32.026572 master-0 kubenswrapper[9368]: I1203 20:08:32.026505 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Dec 03 20:08:32.030624 master-0 kubenswrapper[9368]: I1203 20:08:32.030493 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 03 20:08:32.030914 master-0 kubenswrapper[9368]: I1203 20:08:32.030817 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-nmjr4" Dec 03 20:08:32.042000 master-0 kubenswrapper[9368]: I1203 20:08:32.041903 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-retry-1-master-0"] Dec 03 20:08:32.044555 master-0 kubenswrapper[9368]: I1203 20:08:32.044440 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d7367df-4046-4972-abc2-f07eade0ac6b-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"6d7367df-4046-4972-abc2-f07eade0ac6b\") " 
pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Dec 03 20:08:32.044768 master-0 kubenswrapper[9368]: I1203 20:08:32.044633 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6d7367df-4046-4972-abc2-f07eade0ac6b-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"6d7367df-4046-4972-abc2-f07eade0ac6b\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Dec 03 20:08:32.044960 master-0 kubenswrapper[9368]: I1203 20:08:32.044898 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d7367df-4046-4972-abc2-f07eade0ac6b-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"6d7367df-4046-4972-abc2-f07eade0ac6b\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Dec 03 20:08:32.145811 master-0 kubenswrapper[9368]: I1203 20:08:32.145731 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d7367df-4046-4972-abc2-f07eade0ac6b-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"6d7367df-4046-4972-abc2-f07eade0ac6b\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Dec 03 20:08:32.145811 master-0 kubenswrapper[9368]: I1203 20:08:32.145820 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6d7367df-4046-4972-abc2-f07eade0ac6b-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"6d7367df-4046-4972-abc2-f07eade0ac6b\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Dec 03 20:08:32.146070 master-0 kubenswrapper[9368]: I1203 20:08:32.145848 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/6d7367df-4046-4972-abc2-f07eade0ac6b-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"6d7367df-4046-4972-abc2-f07eade0ac6b\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Dec 03 20:08:32.146070 master-0 kubenswrapper[9368]: I1203 20:08:32.145941 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d7367df-4046-4972-abc2-f07eade0ac6b-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"6d7367df-4046-4972-abc2-f07eade0ac6b\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Dec 03 20:08:32.146166 master-0 kubenswrapper[9368]: I1203 20:08:32.146027 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6d7367df-4046-4972-abc2-f07eade0ac6b-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"6d7367df-4046-4972-abc2-f07eade0ac6b\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Dec 03 20:08:32.176049 master-0 kubenswrapper[9368]: I1203 20:08:32.175968 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d7367df-4046-4972-abc2-f07eade0ac6b-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"6d7367df-4046-4972-abc2-f07eade0ac6b\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Dec 03 20:08:32.368069 master-0 kubenswrapper[9368]: I1203 20:08:32.367963 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Dec 03 20:08:32.817925 master-0 kubenswrapper[9368]: I1203 20:08:32.817852 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-retry-1-master-0"] Dec 03 20:08:32.821124 master-0 kubenswrapper[9368]: W1203 20:08:32.821054 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6d7367df_4046_4972_abc2_f07eade0ac6b.slice/crio-c728a08bf34863d26a3eb03645de957a75f62d9852b3c9d02cdccd664afb9f13 WatchSource:0}: Error finding container c728a08bf34863d26a3eb03645de957a75f62d9852b3c9d02cdccd664afb9f13: Status 404 returned error can't find the container with id c728a08bf34863d26a3eb03645de957a75f62d9852b3c9d02cdccd664afb9f13 Dec 03 20:08:32.956544 master-0 kubenswrapper[9368]: I1203 20:08:32.956461 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"6d7367df-4046-4972-abc2-f07eade0ac6b","Type":"ContainerStarted","Data":"c728a08bf34863d26a3eb03645de957a75f62d9852b3c9d02cdccd664afb9f13"} Dec 03 20:08:33.544565 master-0 kubenswrapper[9368]: I1203 20:08:33.544479 9368 scope.go:117] "RemoveContainer" containerID="30e0205b9f3aae7684b5e5dd37ee0470857f4a7020b8a45ab64071c7372511a7" Dec 03 20:08:33.544948 master-0 kubenswrapper[9368]: E1203 20:08:33.544896 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-8xmrv_openshift-config-operator(0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" Dec 03 20:08:33.971563 master-0 kubenswrapper[9368]: I1203 20:08:33.971427 9368 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"6d7367df-4046-4972-abc2-f07eade0ac6b","Type":"ContainerStarted","Data":"9e705cfbdf86095324ded574be9e84d30f2d828c4c08426be6a6b1ed1158bdf8"} Dec 03 20:08:34.007862 master-0 kubenswrapper[9368]: I1203 20:08:34.006940 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" podStartSLOduration=2.00691094 podStartE2EDuration="2.00691094s" podCreationTimestamp="2025-12-03 20:08:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:08:34.001767643 +0000 UTC m=+779.663017624" watchObservedRunningTime="2025-12-03 20:08:34.00691094 +0000 UTC m=+779.668160891" Dec 03 20:08:37.544864 master-0 kubenswrapper[9368]: I1203 20:08:37.544589 9368 scope.go:117] "RemoveContainer" containerID="0b22734703d42f07c436963e348c3be11ab4f5053e6afed5996abb0dab7d690d" Dec 03 20:08:37.545978 master-0 kubenswrapper[9368]: E1203 20:08:37.545074 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=authentication-operator pod=authentication-operator-7479ffdf48-mfwhz_openshift-authentication-operator(a185ee17-4b4b-4d20-a8ed-56a2a01f1807)\"" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podUID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" Dec 03 20:08:40.618766 master-0 kubenswrapper[9368]: I1203 20:08:40.618692 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Dec 03 20:08:40.619861 master-0 kubenswrapper[9368]: I1203 20:08:40.619528 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:08:40.622824 master-0 kubenswrapper[9368]: I1203 20:08:40.622742 9368 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 20:08:40.623375 master-0 kubenswrapper[9368]: I1203 20:08:40.623319 9368 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-n2brl" Dec 03 20:08:40.638679 master-0 kubenswrapper[9368]: I1203 20:08:40.638622 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Dec 03 20:08:40.708920 master-0 kubenswrapper[9368]: I1203 20:08:40.708838 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:08:40.708920 master-0 kubenswrapper[9368]: I1203 20:08:40.708905 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:08:40.709179 master-0 kubenswrapper[9368]: I1203 20:08:40.709055 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:08:40.810915 master-0 
kubenswrapper[9368]: I1203 20:08:40.810838 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:08:40.811131 master-0 kubenswrapper[9368]: I1203 20:08:40.810924 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:08:40.811131 master-0 kubenswrapper[9368]: I1203 20:08:40.810960 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:08:40.811131 master-0 kubenswrapper[9368]: I1203 20:08:40.810976 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:08:40.811131 master-0 kubenswrapper[9368]: I1203 20:08:40.811073 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" 
Dec 03 20:08:40.828649 master-0 kubenswrapper[9368]: I1203 20:08:40.828572 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:08:40.938124 master-0 kubenswrapper[9368]: I1203 20:08:40.937988 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:08:41.333453 master-0 kubenswrapper[9368]: I1203 20:08:41.333337 9368 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Dec 03 20:08:41.342133 master-0 kubenswrapper[9368]: W1203 20:08:41.342088 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode73e6013_87fc_40e2_a573_39930828faa7.slice/crio-458285225e2ebef4d74a454c189a4334305b7449f5fc5767f5024ba6cedb0614 WatchSource:0}: Error finding container 458285225e2ebef4d74a454c189a4334305b7449f5fc5767f5024ba6cedb0614: Status 404 returned error can't find the container with id 458285225e2ebef4d74a454c189a4334305b7449f5fc5767f5024ba6cedb0614 Dec 03 20:08:42.038987 master-0 kubenswrapper[9368]: I1203 20:08:42.038879 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"e73e6013-87fc-40e2-a573-39930828faa7","Type":"ContainerStarted","Data":"a73aec75e1cef31c969c506854b5ac02887023e7a9ddf7c907ea711f21b91d25"} Dec 03 20:08:42.038987 master-0 kubenswrapper[9368]: I1203 20:08:42.038969 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" 
event={"ID":"e73e6013-87fc-40e2-a573-39930828faa7","Type":"ContainerStarted","Data":"458285225e2ebef4d74a454c189a4334305b7449f5fc5767f5024ba6cedb0614"} Dec 03 20:08:42.062490 master-0 kubenswrapper[9368]: I1203 20:08:42.062367 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" podStartSLOduration=2.062338082 podStartE2EDuration="2.062338082s" podCreationTimestamp="2025-12-03 20:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:08:42.060699554 +0000 UTC m=+787.721949475" watchObservedRunningTime="2025-12-03 20:08:42.062338082 +0000 UTC m=+787.723588053" Dec 03 20:08:46.544581 master-0 kubenswrapper[9368]: I1203 20:08:46.544505 9368 scope.go:117] "RemoveContainer" containerID="30e0205b9f3aae7684b5e5dd37ee0470857f4a7020b8a45ab64071c7372511a7" Dec 03 20:08:46.545494 master-0 kubenswrapper[9368]: E1203 20:08:46.544923 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-8xmrv_openshift-config-operator(0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" podUID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" Dec 03 20:08:49.544464 master-0 kubenswrapper[9368]: I1203 20:08:49.544378 9368 scope.go:117] "RemoveContainer" containerID="0b22734703d42f07c436963e348c3be11ab4f5053e6afed5996abb0dab7d690d" Dec 03 20:08:49.545051 master-0 kubenswrapper[9368]: E1203 20:08:49.544623 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=authentication-operator 
pod=authentication-operator-7479ffdf48-mfwhz_openshift-authentication-operator(a185ee17-4b4b-4d20-a8ed-56a2a01f1807)\"" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podUID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" Dec 03 20:08:57.338926 master-0 kubenswrapper[9368]: I1203 20:08:57.337502 9368 patch_prober.go:28] interesting pod/machine-config-daemon-7t8bs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 20:08:57.338926 master-0 kubenswrapper[9368]: I1203 20:08:57.337579 9368 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 20:08:58.545359 master-0 kubenswrapper[9368]: I1203 20:08:58.545297 9368 scope.go:117] "RemoveContainer" containerID="30e0205b9f3aae7684b5e5dd37ee0470857f4a7020b8a45ab64071c7372511a7" Dec 03 20:08:59.180863 master-0 kubenswrapper[9368]: I1203 20:08:59.180743 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-8xmrv_0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/openshift-config-operator/4.log" Dec 03 20:08:59.181335 master-0 kubenswrapper[9368]: I1203 20:08:59.181301 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" event={"ID":"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9","Type":"ContainerStarted","Data":"da72374f57e50d2b439edf03ff32455cbf2dd16e43251073b0077b9e6bf09574"} Dec 03 20:08:59.181646 master-0 kubenswrapper[9368]: I1203 20:08:59.181613 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 20:09:00.544759 master-0 kubenswrapper[9368]: I1203 20:09:00.544694 9368 scope.go:117] "RemoveContainer" containerID="0b22734703d42f07c436963e348c3be11ab4f5053e6afed5996abb0dab7d690d" Dec 03 20:09:00.545765 master-0 kubenswrapper[9368]: E1203 20:09:00.545061 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=authentication-operator pod=authentication-operator-7479ffdf48-mfwhz_openshift-authentication-operator(a185ee17-4b4b-4d20-a8ed-56a2a01f1807)\"" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podUID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" Dec 03 20:09:01.200305 master-0 kubenswrapper[9368]: I1203 20:09:01.200243 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-78ddcf56f9-nqn2j_b4316c8d-a1d3-4e51-83cc-d0eecb809924/multus-admission-controller/0.log" Dec 03 20:09:01.200503 master-0 kubenswrapper[9368]: I1203 20:09:01.200340 9368 generic.go:334] "Generic (PLEG): container finished" podID="b4316c8d-a1d3-4e51-83cc-d0eecb809924" containerID="ccaa5bcc074786e1602c431f92bcbfc1662e1c5b23f45ded5617110476671e11" exitCode=137 Dec 03 20:09:01.200503 master-0 kubenswrapper[9368]: I1203 20:09:01.200398 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" event={"ID":"b4316c8d-a1d3-4e51-83cc-d0eecb809924","Type":"ContainerDied","Data":"ccaa5bcc074786e1602c431f92bcbfc1662e1c5b23f45ded5617110476671e11"} Dec 03 20:09:01.852556 master-0 kubenswrapper[9368]: I1203 20:09:01.852491 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-78ddcf56f9-nqn2j_b4316c8d-a1d3-4e51-83cc-d0eecb809924/multus-admission-controller/0.log" Dec 03 20:09:01.853110 master-0 
kubenswrapper[9368]: I1203 20:09:01.852606 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" Dec 03 20:09:01.858101 master-0 kubenswrapper[9368]: I1203 20:09:01.858059 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 20:09:01.932327 master-0 kubenswrapper[9368]: I1203 20:09:01.925810 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs\") pod \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " Dec 03 20:09:01.932327 master-0 kubenswrapper[9368]: I1203 20:09:01.925922 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74dvx\" (UniqueName: \"kubernetes.io/projected/b4316c8d-a1d3-4e51-83cc-d0eecb809924-kube-api-access-74dvx\") pod \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\" (UID: \"b4316c8d-a1d3-4e51-83cc-d0eecb809924\") " Dec 03 20:09:01.934638 master-0 kubenswrapper[9368]: I1203 20:09:01.933014 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4316c8d-a1d3-4e51-83cc-d0eecb809924-kube-api-access-74dvx" (OuterVolumeSpecName: "kube-api-access-74dvx") pod "b4316c8d-a1d3-4e51-83cc-d0eecb809924" (UID: "b4316c8d-a1d3-4e51-83cc-d0eecb809924"). InnerVolumeSpecName "kube-api-access-74dvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:09:01.934638 master-0 kubenswrapper[9368]: I1203 20:09:01.933919 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "b4316c8d-a1d3-4e51-83cc-d0eecb809924" (UID: "b4316c8d-a1d3-4e51-83cc-d0eecb809924"). 
InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:09:02.027570 master-0 kubenswrapper[9368]: I1203 20:09:02.027502 9368 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4316c8d-a1d3-4e51-83cc-d0eecb809924-webhook-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:02.027570 master-0 kubenswrapper[9368]: I1203 20:09:02.027555 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74dvx\" (UniqueName: \"kubernetes.io/projected/b4316c8d-a1d3-4e51-83cc-d0eecb809924-kube-api-access-74dvx\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:02.211635 master-0 kubenswrapper[9368]: I1203 20:09:02.211562 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-78ddcf56f9-nqn2j_b4316c8d-a1d3-4e51-83cc-d0eecb809924/multus-admission-controller/0.log" Dec 03 20:09:02.211907 master-0 kubenswrapper[9368]: I1203 20:09:02.211649 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" event={"ID":"b4316c8d-a1d3-4e51-83cc-d0eecb809924","Type":"ContainerDied","Data":"42e1b375dcaebdf8d6351192223452d8b91294cb866b8e2a93c4bc9df5e70f90"} Dec 03 20:09:02.211907 master-0 kubenswrapper[9368]: I1203 20:09:02.211699 9368 scope.go:117] "RemoveContainer" containerID="77dceba290fd067cd611c6d2a5e4c623247f11076c1771bf8dc8e4af20aaef57" Dec 03 20:09:02.211907 master-0 kubenswrapper[9368]: I1203 20:09:02.211742 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j" Dec 03 20:09:02.237747 master-0 kubenswrapper[9368]: I1203 20:09:02.237664 9368 scope.go:117] "RemoveContainer" containerID="ccaa5bcc074786e1602c431f92bcbfc1662e1c5b23f45ded5617110476671e11" Dec 03 20:09:02.267825 master-0 kubenswrapper[9368]: I1203 20:09:02.267668 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j"] Dec 03 20:09:02.273728 master-0 kubenswrapper[9368]: I1203 20:09:02.273651 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-78ddcf56f9-nqn2j"] Dec 03 20:09:02.555230 master-0 kubenswrapper[9368]: I1203 20:09:02.555068 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4316c8d-a1d3-4e51-83cc-d0eecb809924" path="/var/lib/kubelet/pods/b4316c8d-a1d3-4e51-83cc-d0eecb809924/volumes" Dec 03 20:09:06.271328 master-0 kubenswrapper[9368]: I1203 20:09:06.271269 9368 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Dec 03 20:09:06.272843 master-0 kubenswrapper[9368]: I1203 20:09:06.272311 9368 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 20:09:06.272843 master-0 kubenswrapper[9368]: I1203 20:09:06.272588 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" containerID="cri-o://73af3993c24e82ebaeef170be65e78e6baadc7d344c100ebc05ff0759cbf9b83" gracePeriod=30 Dec 03 20:09:06.273155 master-0 kubenswrapper[9368]: E1203 20:09:06.272691 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 20:09:06.273155 master-0 
kubenswrapper[9368]: I1203 20:09:06.272873 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 20:09:06.273155 master-0 kubenswrapper[9368]: E1203 20:09:06.272891 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" Dec 03 20:09:06.273155 master-0 kubenswrapper[9368]: I1203 20:09:06.272900 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" Dec 03 20:09:06.273155 master-0 kubenswrapper[9368]: E1203 20:09:06.272916 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 20:09:06.273155 master-0 kubenswrapper[9368]: I1203 20:09:06.272924 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 20:09:06.273155 master-0 kubenswrapper[9368]: I1203 20:09:06.272926 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" containerID="cri-o://9c9e87ce301fc8a45a8230a2fb6405ec79e3eaf7be312bbca4d4b9762860041c" gracePeriod=30 Dec 03 20:09:06.273155 master-0 kubenswrapper[9368]: E1203 20:09:06.272936 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" Dec 03 20:09:06.273155 master-0 kubenswrapper[9368]: I1203 20:09:06.273017 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" Dec 03 20:09:06.273155 master-0 kubenswrapper[9368]: E1203 20:09:06.273041 9368 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" Dec 03 20:09:06.273155 master-0 kubenswrapper[9368]: I1203 20:09:06.273049 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" Dec 03 20:09:06.273155 master-0 kubenswrapper[9368]: E1203 20:09:06.273143 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4316c8d-a1d3-4e51-83cc-d0eecb809924" containerName="kube-rbac-proxy" Dec 03 20:09:06.273155 master-0 kubenswrapper[9368]: I1203 20:09:06.273155 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4316c8d-a1d3-4e51-83cc-d0eecb809924" containerName="kube-rbac-proxy" Dec 03 20:09:06.273155 master-0 kubenswrapper[9368]: E1203 20:09:06.273181 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: I1203 20:09:06.273189 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: E1203 20:09:06.273204 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4316c8d-a1d3-4e51-83cc-d0eecb809924" containerName="multus-admission-controller" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: I1203 20:09:06.273211 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4316c8d-a1d3-4e51-83cc-d0eecb809924" containerName="multus-admission-controller" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: E1203 20:09:06.273242 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: I1203 20:09:06.273249 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bce50c457ac1f4721bc81a570dd238a" 
containerName="kube-controller-manager" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: E1203 20:09:06.273266 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: I1203 20:09:06.273272 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: I1203 20:09:06.273535 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4316c8d-a1d3-4e51-83cc-d0eecb809924" containerName="multus-admission-controller" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: I1203 20:09:06.273555 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: I1203 20:09:06.273572 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: I1203 20:09:06.273581 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: I1203 20:09:06.273602 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: I1203 20:09:06.273610 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4316c8d-a1d3-4e51-83cc-d0eecb809924" containerName="kube-rbac-proxy" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: I1203 20:09:06.273620 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bce50c457ac1f4721bc81a570dd238a" 
containerName="kube-controller-manager" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: I1203 20:09:06.273628 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: I1203 20:09:06.273636 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: I1203 20:09:06.273648 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: E1203 20:09:06.273767 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: I1203 20:09:06.273799 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: E1203 20:09:06.273814 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: I1203 20:09:06.273820 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 20:09:06.274020 master-0 kubenswrapper[9368]: I1203 20:09:06.273920 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" Dec 03 20:09:06.276001 master-0 kubenswrapper[9368]: I1203 20:09:06.274135 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bce50c457ac1f4721bc81a570dd238a" 
containerName="cluster-policy-controller" Dec 03 20:09:06.276001 master-0 kubenswrapper[9368]: I1203 20:09:06.274842 9368 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:06.324458 master-0 kubenswrapper[9368]: I1203 20:09:06.324377 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 20:09:06.395550 master-0 kubenswrapper[9368]: I1203 20:09:06.395488 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/dc347c4e75ec09c3a7fea6a3ba3ee63c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"dc347c4e75ec09c3a7fea6a3ba3ee63c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:06.395805 master-0 kubenswrapper[9368]: I1203 20:09:06.395726 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/dc347c4e75ec09c3a7fea6a3ba3ee63c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"dc347c4e75ec09c3a7fea6a3ba3ee63c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:06.477548 master-0 kubenswrapper[9368]: I1203 20:09:06.477504 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:09:06.497423 master-0 kubenswrapper[9368]: I1203 20:09:06.497350 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/dc347c4e75ec09c3a7fea6a3ba3ee63c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"dc347c4e75ec09c3a7fea6a3ba3ee63c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:06.497610 master-0 kubenswrapper[9368]: I1203 20:09:06.497443 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/dc347c4e75ec09c3a7fea6a3ba3ee63c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"dc347c4e75ec09c3a7fea6a3ba3ee63c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:06.497610 master-0 kubenswrapper[9368]: I1203 20:09:06.497474 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/dc347c4e75ec09c3a7fea6a3ba3ee63c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"dc347c4e75ec09c3a7fea6a3ba3ee63c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:06.497695 master-0 kubenswrapper[9368]: I1203 20:09:06.497617 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/dc347c4e75ec09c3a7fea6a3ba3ee63c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"dc347c4e75ec09c3a7fea6a3ba3ee63c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:06.507709 master-0 kubenswrapper[9368]: I1203 20:09:06.507636 9368 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="b12f2117-ffd2-4818-9b77-bba54560d6ab" 
Dec 03 20:09:06.556069 master-0 kubenswrapper[9368]: I1203 20:09:06.555896 9368 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="" Dec 03 20:09:06.577989 master-0 kubenswrapper[9368]: I1203 20:09:06.577890 9368 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Dec 03 20:09:06.577989 master-0 kubenswrapper[9368]: I1203 20:09:06.577942 9368 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="b12f2117-ffd2-4818-9b77-bba54560d6ab" Dec 03 20:09:06.581680 master-0 kubenswrapper[9368]: I1203 20:09:06.581603 9368 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Dec 03 20:09:06.581680 master-0 kubenswrapper[9368]: I1203 20:09:06.581648 9368 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="b12f2117-ffd2-4818-9b77-bba54560d6ab" Dec 03 20:09:06.598572 master-0 kubenswrapper[9368]: I1203 20:09:06.598437 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-config\") pod \"7bce50c457ac1f4721bc81a570dd238a\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " Dec 03 20:09:06.598572 master-0 kubenswrapper[9368]: I1203 20:09:06.598612 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-secrets\") pod \"7bce50c457ac1f4721bc81a570dd238a\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " Dec 03 20:09:06.598994 master-0 kubenswrapper[9368]: I1203 20:09:06.598648 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-config" (OuterVolumeSpecName: "config") pod "7bce50c457ac1f4721bc81a570dd238a" (UID: "7bce50c457ac1f4721bc81a570dd238a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:09:06.598994 master-0 kubenswrapper[9368]: I1203 20:09:06.598692 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-etc-kubernetes-cloud\") pod \"7bce50c457ac1f4721bc81a570dd238a\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " Dec 03 20:09:06.598994 master-0 kubenswrapper[9368]: I1203 20:09:06.598797 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "7bce50c457ac1f4721bc81a570dd238a" (UID: "7bce50c457ac1f4721bc81a570dd238a"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:09:06.598994 master-0 kubenswrapper[9368]: I1203 20:09:06.598841 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-secrets" (OuterVolumeSpecName: "secrets") pod "7bce50c457ac1f4721bc81a570dd238a" (UID: "7bce50c457ac1f4721bc81a570dd238a"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:09:06.598994 master-0 kubenswrapper[9368]: I1203 20:09:06.598843 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-logs\") pod \"7bce50c457ac1f4721bc81a570dd238a\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " Dec 03 20:09:06.598994 master-0 kubenswrapper[9368]: I1203 20:09:06.598898 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-ssl-certs-host\") pod \"7bce50c457ac1f4721bc81a570dd238a\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " Dec 03 20:09:06.598994 master-0 kubenswrapper[9368]: I1203 20:09:06.598961 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-logs" (OuterVolumeSpecName: "logs") pod "7bce50c457ac1f4721bc81a570dd238a" (UID: "7bce50c457ac1f4721bc81a570dd238a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:09:06.599564 master-0 kubenswrapper[9368]: I1203 20:09:06.599057 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "7bce50c457ac1f4721bc81a570dd238a" (UID: "7bce50c457ac1f4721bc81a570dd238a"). InnerVolumeSpecName "ssl-certs-host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:09:06.599954 master-0 kubenswrapper[9368]: I1203 20:09:06.599913 9368 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-secrets\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:06.599954 master-0 kubenswrapper[9368]: I1203 20:09:06.599943 9368 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:06.599954 master-0 kubenswrapper[9368]: I1203 20:09:06.599960 9368 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-logs\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:06.600172 master-0 kubenswrapper[9368]: I1203 20:09:06.599974 9368 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:06.600172 master-0 kubenswrapper[9368]: I1203 20:09:06.599986 9368 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-config\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:06.623759 master-0 kubenswrapper[9368]: I1203 20:09:06.623651 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:06.645802 master-0 kubenswrapper[9368]: W1203 20:09:06.645723 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc347c4e75ec09c3a7fea6a3ba3ee63c.slice/crio-d48938de69765a143714e3f72409a39d0152006d3aa2fff72b2bf45a3ae1e272 WatchSource:0}: Error finding container d48938de69765a143714e3f72409a39d0152006d3aa2fff72b2bf45a3ae1e272: Status 404 returned error can't find the container with id d48938de69765a143714e3f72409a39d0152006d3aa2fff72b2bf45a3ae1e272 Dec 03 20:09:07.260771 master-0 kubenswrapper[9368]: I1203 20:09:07.260673 9368 generic.go:334] "Generic (PLEG): container finished" podID="7bce50c457ac1f4721bc81a570dd238a" containerID="73af3993c24e82ebaeef170be65e78e6baadc7d344c100ebc05ff0759cbf9b83" exitCode=0 Dec 03 20:09:07.260771 master-0 kubenswrapper[9368]: I1203 20:09:07.260711 9368 generic.go:334] "Generic (PLEG): container finished" podID="7bce50c457ac1f4721bc81a570dd238a" containerID="9c9e87ce301fc8a45a8230a2fb6405ec79e3eaf7be312bbca4d4b9762860041c" exitCode=0 Dec 03 20:09:07.260771 master-0 kubenswrapper[9368]: I1203 20:09:07.260728 9368 scope.go:117] "RemoveContainer" containerID="73af3993c24e82ebaeef170be65e78e6baadc7d344c100ebc05ff0759cbf9b83" Dec 03 20:09:07.260771 master-0 kubenswrapper[9368]: I1203 20:09:07.260761 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 20:09:07.265504 master-0 kubenswrapper[9368]: I1203 20:09:07.265429 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"dc347c4e75ec09c3a7fea6a3ba3ee63c","Type":"ContainerStarted","Data":"3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023"} Dec 03 20:09:07.265504 master-0 kubenswrapper[9368]: I1203 20:09:07.265499 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"dc347c4e75ec09c3a7fea6a3ba3ee63c","Type":"ContainerStarted","Data":"e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599"} Dec 03 20:09:07.265728 master-0 kubenswrapper[9368]: I1203 20:09:07.265521 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"dc347c4e75ec09c3a7fea6a3ba3ee63c","Type":"ContainerStarted","Data":"d48938de69765a143714e3f72409a39d0152006d3aa2fff72b2bf45a3ae1e272"} Dec 03 20:09:07.268569 master-0 kubenswrapper[9368]: I1203 20:09:07.268491 9368 generic.go:334] "Generic (PLEG): container finished" podID="6d7367df-4046-4972-abc2-f07eade0ac6b" containerID="9e705cfbdf86095324ded574be9e84d30f2d828c4c08426be6a6b1ed1158bdf8" exitCode=0 Dec 03 20:09:07.268569 master-0 kubenswrapper[9368]: I1203 20:09:07.268540 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"6d7367df-4046-4972-abc2-f07eade0ac6b","Type":"ContainerDied","Data":"9e705cfbdf86095324ded574be9e84d30f2d828c4c08426be6a6b1ed1158bdf8"} Dec 03 20:09:07.283955 master-0 kubenswrapper[9368]: I1203 20:09:07.283660 9368 scope.go:117] "RemoveContainer" containerID="46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967" Dec 03 20:09:07.307068 master-0 kubenswrapper[9368]: 
I1203 20:09:07.307018 9368 scope.go:117] "RemoveContainer" containerID="9c9e87ce301fc8a45a8230a2fb6405ec79e3eaf7be312bbca4d4b9762860041c" Dec 03 20:09:07.329711 master-0 kubenswrapper[9368]: I1203 20:09:07.329659 9368 scope.go:117] "RemoveContainer" containerID="202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81" Dec 03 20:09:07.348819 master-0 kubenswrapper[9368]: I1203 20:09:07.348759 9368 scope.go:117] "RemoveContainer" containerID="73af3993c24e82ebaeef170be65e78e6baadc7d344c100ebc05ff0759cbf9b83" Dec 03 20:09:07.349224 master-0 kubenswrapper[9368]: E1203 20:09:07.349189 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73af3993c24e82ebaeef170be65e78e6baadc7d344c100ebc05ff0759cbf9b83\": container with ID starting with 73af3993c24e82ebaeef170be65e78e6baadc7d344c100ebc05ff0759cbf9b83 not found: ID does not exist" containerID="73af3993c24e82ebaeef170be65e78e6baadc7d344c100ebc05ff0759cbf9b83" Dec 03 20:09:07.349326 master-0 kubenswrapper[9368]: I1203 20:09:07.349259 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73af3993c24e82ebaeef170be65e78e6baadc7d344c100ebc05ff0759cbf9b83"} err="failed to get container status \"73af3993c24e82ebaeef170be65e78e6baadc7d344c100ebc05ff0759cbf9b83\": rpc error: code = NotFound desc = could not find container \"73af3993c24e82ebaeef170be65e78e6baadc7d344c100ebc05ff0759cbf9b83\": container with ID starting with 73af3993c24e82ebaeef170be65e78e6baadc7d344c100ebc05ff0759cbf9b83 not found: ID does not exist" Dec 03 20:09:07.349326 master-0 kubenswrapper[9368]: I1203 20:09:07.349284 9368 scope.go:117] "RemoveContainer" containerID="46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967" Dec 03 20:09:07.349865 master-0 kubenswrapper[9368]: E1203 20:09:07.349832 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967\": container with ID starting with 46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967 not found: ID does not exist" containerID="46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967" Dec 03 20:09:07.349985 master-0 kubenswrapper[9368]: I1203 20:09:07.349864 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967"} err="failed to get container status \"46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967\": rpc error: code = NotFound desc = could not find container \"46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967\": container with ID starting with 46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967 not found: ID does not exist" Dec 03 20:09:07.349985 master-0 kubenswrapper[9368]: I1203 20:09:07.349886 9368 scope.go:117] "RemoveContainer" containerID="9c9e87ce301fc8a45a8230a2fb6405ec79e3eaf7be312bbca4d4b9762860041c" Dec 03 20:09:07.350606 master-0 kubenswrapper[9368]: E1203 20:09:07.350555 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c9e87ce301fc8a45a8230a2fb6405ec79e3eaf7be312bbca4d4b9762860041c\": container with ID starting with 9c9e87ce301fc8a45a8230a2fb6405ec79e3eaf7be312bbca4d4b9762860041c not found: ID does not exist" containerID="9c9e87ce301fc8a45a8230a2fb6405ec79e3eaf7be312bbca4d4b9762860041c" Dec 03 20:09:07.350699 master-0 kubenswrapper[9368]: I1203 20:09:07.350606 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c9e87ce301fc8a45a8230a2fb6405ec79e3eaf7be312bbca4d4b9762860041c"} err="failed to get container status \"9c9e87ce301fc8a45a8230a2fb6405ec79e3eaf7be312bbca4d4b9762860041c\": rpc error: code = NotFound desc = could not find container 
\"9c9e87ce301fc8a45a8230a2fb6405ec79e3eaf7be312bbca4d4b9762860041c\": container with ID starting with 9c9e87ce301fc8a45a8230a2fb6405ec79e3eaf7be312bbca4d4b9762860041c not found: ID does not exist" Dec 03 20:09:07.350699 master-0 kubenswrapper[9368]: I1203 20:09:07.350635 9368 scope.go:117] "RemoveContainer" containerID="202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81" Dec 03 20:09:07.351332 master-0 kubenswrapper[9368]: E1203 20:09:07.351292 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81\": container with ID starting with 202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81 not found: ID does not exist" containerID="202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81" Dec 03 20:09:07.351332 master-0 kubenswrapper[9368]: I1203 20:09:07.351323 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81"} err="failed to get container status \"202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81\": rpc error: code = NotFound desc = could not find container \"202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81\": container with ID starting with 202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81 not found: ID does not exist" Dec 03 20:09:07.351524 master-0 kubenswrapper[9368]: I1203 20:09:07.351342 9368 scope.go:117] "RemoveContainer" containerID="73af3993c24e82ebaeef170be65e78e6baadc7d344c100ebc05ff0759cbf9b83" Dec 03 20:09:07.351887 master-0 kubenswrapper[9368]: I1203 20:09:07.351835 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73af3993c24e82ebaeef170be65e78e6baadc7d344c100ebc05ff0759cbf9b83"} err="failed to get container status 
\"73af3993c24e82ebaeef170be65e78e6baadc7d344c100ebc05ff0759cbf9b83\": rpc error: code = NotFound desc = could not find container \"73af3993c24e82ebaeef170be65e78e6baadc7d344c100ebc05ff0759cbf9b83\": container with ID starting with 73af3993c24e82ebaeef170be65e78e6baadc7d344c100ebc05ff0759cbf9b83 not found: ID does not exist" Dec 03 20:09:07.351887 master-0 kubenswrapper[9368]: I1203 20:09:07.351869 9368 scope.go:117] "RemoveContainer" containerID="46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967" Dec 03 20:09:07.352258 master-0 kubenswrapper[9368]: I1203 20:09:07.352220 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967"} err="failed to get container status \"46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967\": rpc error: code = NotFound desc = could not find container \"46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967\": container with ID starting with 46de1ad3bf6f682c496afa1dd6b042e363037a0e89fe44635c7a1ff9719b6967 not found: ID does not exist" Dec 03 20:09:07.352258 master-0 kubenswrapper[9368]: I1203 20:09:07.352249 9368 scope.go:117] "RemoveContainer" containerID="9c9e87ce301fc8a45a8230a2fb6405ec79e3eaf7be312bbca4d4b9762860041c" Dec 03 20:09:07.352658 master-0 kubenswrapper[9368]: I1203 20:09:07.352624 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c9e87ce301fc8a45a8230a2fb6405ec79e3eaf7be312bbca4d4b9762860041c"} err="failed to get container status \"9c9e87ce301fc8a45a8230a2fb6405ec79e3eaf7be312bbca4d4b9762860041c\": rpc error: code = NotFound desc = could not find container \"9c9e87ce301fc8a45a8230a2fb6405ec79e3eaf7be312bbca4d4b9762860041c\": container with ID starting with 9c9e87ce301fc8a45a8230a2fb6405ec79e3eaf7be312bbca4d4b9762860041c not found: ID does not exist" Dec 03 20:09:07.352658 master-0 kubenswrapper[9368]: I1203 20:09:07.352652 9368 
scope.go:117] "RemoveContainer" containerID="202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81" Dec 03 20:09:07.353387 master-0 kubenswrapper[9368]: I1203 20:09:07.353322 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81"} err="failed to get container status \"202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81\": rpc error: code = NotFound desc = could not find container \"202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81\": container with ID starting with 202f5c510c7dffcf3778f8c8cad285e6acbde2095d3e758d800e60d2aa080a81 not found: ID does not exist" Dec 03 20:09:08.280351 master-0 kubenswrapper[9368]: I1203 20:09:08.280303 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"dc347c4e75ec09c3a7fea6a3ba3ee63c","Type":"ContainerStarted","Data":"eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2"} Dec 03 20:09:08.280577 master-0 kubenswrapper[9368]: I1203 20:09:08.280357 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"dc347c4e75ec09c3a7fea6a3ba3ee63c","Type":"ContainerStarted","Data":"1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9"} Dec 03 20:09:08.325365 master-0 kubenswrapper[9368]: I1203 20:09:08.324684 9368 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.324662089 podStartE2EDuration="2.324662089s" podCreationTimestamp="2025-12-03 20:09:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:09:08.322525847 +0000 UTC m=+813.983775768" watchObservedRunningTime="2025-12-03 20:09:08.324662089 
+0000 UTC m=+813.985912000" Dec 03 20:09:08.562359 master-0 kubenswrapper[9368]: I1203 20:09:08.562307 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bce50c457ac1f4721bc81a570dd238a" path="/var/lib/kubelet/pods/7bce50c457ac1f4721bc81a570dd238a/volumes" Dec 03 20:09:08.574294 master-0 kubenswrapper[9368]: I1203 20:09:08.574252 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Dec 03 20:09:08.628584 master-0 kubenswrapper[9368]: I1203 20:09:08.628525 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6d7367df-4046-4972-abc2-f07eade0ac6b-var-lock\") pod \"6d7367df-4046-4972-abc2-f07eade0ac6b\" (UID: \"6d7367df-4046-4972-abc2-f07eade0ac6b\") " Dec 03 20:09:08.628815 master-0 kubenswrapper[9368]: I1203 20:09:08.628618 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d7367df-4046-4972-abc2-f07eade0ac6b-kube-api-access\") pod \"6d7367df-4046-4972-abc2-f07eade0ac6b\" (UID: \"6d7367df-4046-4972-abc2-f07eade0ac6b\") " Dec 03 20:09:08.628815 master-0 kubenswrapper[9368]: I1203 20:09:08.628631 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d7367df-4046-4972-abc2-f07eade0ac6b-var-lock" (OuterVolumeSpecName: "var-lock") pod "6d7367df-4046-4972-abc2-f07eade0ac6b" (UID: "6d7367df-4046-4972-abc2-f07eade0ac6b"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:09:08.628815 master-0 kubenswrapper[9368]: I1203 20:09:08.628666 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d7367df-4046-4972-abc2-f07eade0ac6b-kubelet-dir\") pod \"6d7367df-4046-4972-abc2-f07eade0ac6b\" (UID: \"6d7367df-4046-4972-abc2-f07eade0ac6b\") " Dec 03 20:09:08.628954 master-0 kubenswrapper[9368]: I1203 20:09:08.628833 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d7367df-4046-4972-abc2-f07eade0ac6b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6d7367df-4046-4972-abc2-f07eade0ac6b" (UID: "6d7367df-4046-4972-abc2-f07eade0ac6b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:09:08.629089 master-0 kubenswrapper[9368]: I1203 20:09:08.629056 9368 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6d7367df-4046-4972-abc2-f07eade0ac6b-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:08.629089 master-0 kubenswrapper[9368]: I1203 20:09:08.629079 9368 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d7367df-4046-4972-abc2-f07eade0ac6b-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:08.631329 master-0 kubenswrapper[9368]: I1203 20:09:08.631291 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d7367df-4046-4972-abc2-f07eade0ac6b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6d7367df-4046-4972-abc2-f07eade0ac6b" (UID: "6d7367df-4046-4972-abc2-f07eade0ac6b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:09:08.729923 master-0 kubenswrapper[9368]: I1203 20:09:08.729849 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d7367df-4046-4972-abc2-f07eade0ac6b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:09.290857 master-0 kubenswrapper[9368]: I1203 20:09:09.290815 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Dec 03 20:09:09.291415 master-0 kubenswrapper[9368]: I1203 20:09:09.290828 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"6d7367df-4046-4972-abc2-f07eade0ac6b","Type":"ContainerDied","Data":"c728a08bf34863d26a3eb03645de957a75f62d9852b3c9d02cdccd664afb9f13"} Dec 03 20:09:09.291569 master-0 kubenswrapper[9368]: I1203 20:09:09.291519 9368 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c728a08bf34863d26a3eb03645de957a75f62d9852b3c9d02cdccd664afb9f13" Dec 03 20:09:13.543624 master-0 kubenswrapper[9368]: I1203 20:09:13.543521 9368 scope.go:117] "RemoveContainer" containerID="0b22734703d42f07c436963e348c3be11ab4f5053e6afed5996abb0dab7d690d" Dec 03 20:09:13.544724 master-0 kubenswrapper[9368]: E1203 20:09:13.543725 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=authentication-operator pod=authentication-operator-7479ffdf48-mfwhz_openshift-authentication-operator(a185ee17-4b4b-4d20-a8ed-56a2a01f1807)\"" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" podUID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" Dec 03 20:09:16.624295 master-0 kubenswrapper[9368]: I1203 20:09:16.624205 9368 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:16.625260 master-0 kubenswrapper[9368]: I1203 20:09:16.624319 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:16.625260 master-0 kubenswrapper[9368]: I1203 20:09:16.624364 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:16.625260 master-0 kubenswrapper[9368]: I1203 20:09:16.624392 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:16.625260 master-0 kubenswrapper[9368]: I1203 20:09:16.624696 9368 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Dec 03 20:09:16.625260 master-0 kubenswrapper[9368]: I1203 20:09:16.624809 9368 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 03 20:09:16.632229 master-0 kubenswrapper[9368]: I1203 20:09:16.632160 9368 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:17.355773 master-0 kubenswrapper[9368]: I1203 20:09:17.355727 9368 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:19.580228 master-0 kubenswrapper[9368]: I1203 20:09:19.580123 9368 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 03 20:09:19.581199 master-0 kubenswrapper[9368]: E1203 20:09:19.580622 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d7367df-4046-4972-abc2-f07eade0ac6b" containerName="installer" Dec 03 20:09:19.581199 master-0 kubenswrapper[9368]: I1203 20:09:19.580650 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d7367df-4046-4972-abc2-f07eade0ac6b" containerName="installer" Dec 03 20:09:19.581199 master-0 kubenswrapper[9368]: I1203 20:09:19.580897 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d7367df-4046-4972-abc2-f07eade0ac6b" containerName="installer" Dec 03 20:09:19.581482 master-0 kubenswrapper[9368]: I1203 20:09:19.581436 9368 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Dec 03 20:09:19.581677 master-0 kubenswrapper[9368]: I1203 20:09:19.581609 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:19.581843 master-0 kubenswrapper[9368]: I1203 20:09:19.581774 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver" containerID="cri-o://f5e393ad9e4b2248a04a2f1824c04ab00a09cf0b58d03e8aab531d5a360dcee3" gracePeriod=15 Dec 03 20:09:19.582023 master-0 kubenswrapper[9368]: I1203 20:09:19.581952 9368 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://c991f7eb8b7ab6bf0ec4d63b9e190d2c916d375ff6a09d91a53387bce5766b67" gracePeriod=15 Dec 03 20:09:19.634374 master-0 kubenswrapper[9368]: I1203 20:09:19.634299 9368 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 03 20:09:19.672367 master-0 kubenswrapper[9368]: I1203 20:09:19.672323 9368 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 03 20:09:19.673289 master-0 kubenswrapper[9368]: E1203 20:09:19.673268 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver-insecure-readyz" Dec 03 20:09:19.673424 master-0 kubenswrapper[9368]: I1203 20:09:19.673411 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver-insecure-readyz" Dec 03 20:09:19.673556 master-0 kubenswrapper[9368]: E1203 20:09:19.673544 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver" Dec 03 20:09:19.673643 master-0 kubenswrapper[9368]: I1203 
20:09:19.673626 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver" Dec 03 20:09:19.673948 master-0 kubenswrapper[9368]: E1203 20:09:19.673925 9368 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13238af3704fe583f617f61e755cf4c2" containerName="setup" Dec 03 20:09:19.674028 master-0 kubenswrapper[9368]: I1203 20:09:19.674018 9368 state_mem.go:107] "Deleted CPUSet assignment" podUID="13238af3704fe583f617f61e755cf4c2" containerName="setup" Dec 03 20:09:19.674193 master-0 kubenswrapper[9368]: I1203 20:09:19.674180 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver-insecure-readyz" Dec 03 20:09:19.674279 master-0 kubenswrapper[9368]: I1203 20:09:19.674268 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="13238af3704fe583f617f61e755cf4c2" containerName="setup" Dec 03 20:09:19.674352 master-0 kubenswrapper[9368]: I1203 20:09:19.674342 9368 memory_manager.go:354] "RemoveStaleState removing state" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver" Dec 03 20:09:19.675811 master-0 kubenswrapper[9368]: I1203 20:09:19.675792 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:19.678059 master-0 kubenswrapper[9368]: I1203 20:09:19.677986 9368 status_manager.go:851] "Failed to get status for pod" podUID="b0f7c518a656139710b17a7667c8b898" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:09:19.705101 master-0 kubenswrapper[9368]: I1203 20:09:19.705052 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:19.705345 master-0 kubenswrapper[9368]: I1203 20:09:19.705130 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:19.705345 master-0 kubenswrapper[9368]: I1203 20:09:19.705246 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:19.705584 master-0 kubenswrapper[9368]: I1203 20:09:19.705385 9368 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:19.705584 master-0 kubenswrapper[9368]: I1203 20:09:19.705448 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:19.762494 master-0 kubenswrapper[9368]: E1203 20:09:19.762430 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:19.807417 master-0 kubenswrapper[9368]: I1203 20:09:19.807336 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:19.807417 master-0 kubenswrapper[9368]: I1203 20:09:19.807425 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:19.807715 master-0 
kubenswrapper[9368]: I1203 20:09:19.807458 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:19.807715 master-0 kubenswrapper[9368]: I1203 20:09:19.807507 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:19.807715 master-0 kubenswrapper[9368]: I1203 20:09:19.807552 9368 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:19.807715 master-0 kubenswrapper[9368]: I1203 20:09:19.807553 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:19.807715 master-0 kubenswrapper[9368]: I1203 20:09:19.807638 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:19.807930 master-0 kubenswrapper[9368]: I1203 20:09:19.807729 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:19.807930 master-0 kubenswrapper[9368]: I1203 20:09:19.807818 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:19.807930 master-0 kubenswrapper[9368]: I1203 20:09:19.807882 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:19.807930 master-0 kubenswrapper[9368]: I1203 20:09:19.807890 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:19.808043 master-0 kubenswrapper[9368]: I1203 20:09:19.807939 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:19.808043 master-0 kubenswrapper[9368]: I1203 20:09:19.807990 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:19.910024 master-0 kubenswrapper[9368]: I1203 20:09:19.909879 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:19.910024 master-0 kubenswrapper[9368]: I1203 20:09:19.909982 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:19.910024 master-0 kubenswrapper[9368]: I1203 20:09:19.910018 9368 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:19.910534 master-0 kubenswrapper[9368]: I1203 20:09:19.910036 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:19.910534 master-0 kubenswrapper[9368]: I1203 20:09:19.910143 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:19.910534 master-0 kubenswrapper[9368]: I1203 20:09:19.910173 9368 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:19.925546 master-0 kubenswrapper[9368]: I1203 20:09:19.925484 9368 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:19.958599 master-0 kubenswrapper[9368]: W1203 20:09:19.958500 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0f7c518a656139710b17a7667c8b898.slice/crio-fcef5c26197d88811bd202bc70d1bd384b05a27d2d38eb35b486b482203bd347 WatchSource:0}: Error finding container fcef5c26197d88811bd202bc70d1bd384b05a27d2d38eb35b486b482203bd347: Status 404 returned error can't find the container with id fcef5c26197d88811bd202bc70d1bd384b05a27d2d38eb35b486b482203bd347 Dec 03 20:09:19.962516 master-0 kubenswrapper[9368]: E1203 20:09:19.962399 9368 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.187dcd7ba6c37ece openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:b0f7c518a656139710b17a7667c8b898,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 20:09:19.96160379 +0000 UTC m=+825.622853691,LastTimestamp:2025-12-03 20:09:19.96160379 +0000 UTC m=+825.622853691,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 20:09:20.064135 master-0 kubenswrapper[9368]: I1203 20:09:20.064013 9368 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:20.089938 master-0 kubenswrapper[9368]: W1203 20:09:20.089856 9368 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefa3433149c0833909dd6c97d45272ed.slice/crio-1caef0a819570cf6a2811866d8d10fd6e09b188be5e4d722967523e3ffefcc98 WatchSource:0}: Error finding container 1caef0a819570cf6a2811866d8d10fd6e09b188be5e4d722967523e3ffefcc98: Status 404 returned error can't find the container with id 1caef0a819570cf6a2811866d8d10fd6e09b188be5e4d722967523e3ffefcc98 Dec 03 20:09:20.370458 master-0 kubenswrapper[9368]: I1203 20:09:20.370411 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"b0f7c518a656139710b17a7667c8b898","Type":"ContainerStarted","Data":"fcef5c26197d88811bd202bc70d1bd384b05a27d2d38eb35b486b482203bd347"} Dec 03 20:09:20.372487 master-0 kubenswrapper[9368]: I1203 20:09:20.372435 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"efa3433149c0833909dd6c97d45272ed","Type":"ContainerStarted","Data":"1caef0a819570cf6a2811866d8d10fd6e09b188be5e4d722967523e3ffefcc98"} Dec 03 20:09:20.375520 master-0 kubenswrapper[9368]: I1203 20:09:20.375479 9368 generic.go:334] "Generic (PLEG): container finished" podID="13238af3704fe583f617f61e755cf4c2" containerID="c991f7eb8b7ab6bf0ec4d63b9e190d2c916d375ff6a09d91a53387bce5766b67" exitCode=0 Dec 03 20:09:20.378229 master-0 kubenswrapper[9368]: I1203 20:09:20.378196 9368 generic.go:334] "Generic (PLEG): container finished" podID="e73e6013-87fc-40e2-a573-39930828faa7" containerID="a73aec75e1cef31c969c506854b5ac02887023e7a9ddf7c907ea711f21b91d25" exitCode=0 Dec 03 20:09:20.378318 master-0 kubenswrapper[9368]: I1203 20:09:20.378242 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"e73e6013-87fc-40e2-a573-39930828faa7","Type":"ContainerDied","Data":"a73aec75e1cef31c969c506854b5ac02887023e7a9ddf7c907ea711f21b91d25"} Dec 03 20:09:20.380065 master-0 kubenswrapper[9368]: I1203 20:09:20.379979 9368 status_manager.go:851] "Failed to get status for pod" podUID="e73e6013-87fc-40e2-a573-39930828faa7" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:09:20.381096 master-0 kubenswrapper[9368]: I1203 20:09:20.381021 9368 status_manager.go:851] "Failed to get status for pod" podUID="b0f7c518a656139710b17a7667c8b898" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:09:21.390581 master-0 kubenswrapper[9368]: I1203 20:09:21.390471 9368 generic.go:334] "Generic (PLEG): container finished" podID="efa3433149c0833909dd6c97d45272ed" containerID="b7f227f0c18811b7cbe2379656751997168d72c962c1085bafeb8e91aa107a35" exitCode=0 Dec 03 20:09:21.391565 master-0 kubenswrapper[9368]: I1203 20:09:21.390616 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"efa3433149c0833909dd6c97d45272ed","Type":"ContainerDied","Data":"b7f227f0c18811b7cbe2379656751997168d72c962c1085bafeb8e91aa107a35"} Dec 03 20:09:21.391869 master-0 kubenswrapper[9368]: I1203 20:09:21.391724 9368 status_manager.go:851] "Failed to get status for pod" podUID="e73e6013-87fc-40e2-a573-39930828faa7" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:09:21.392519 master-0 kubenswrapper[9368]: I1203 20:09:21.392473 9368 status_manager.go:851] "Failed to get status for pod" podUID="b0f7c518a656139710b17a7667c8b898" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:09:21.393373 master-0 kubenswrapper[9368]: E1203 20:09:21.392767 9368 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:21.393373 master-0 kubenswrapper[9368]: I1203 20:09:21.392871 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"b0f7c518a656139710b17a7667c8b898","Type":"ContainerStarted","Data":"10dd5e50757ca6d8fb428d9d41440e88b1cc3fce51685a0860bb2b0898ea0950"} Dec 03 20:09:21.395271 master-0 kubenswrapper[9368]: I1203 20:09:21.395168 9368 status_manager.go:851] "Failed to get status for pod" podUID="e73e6013-87fc-40e2-a573-39930828faa7" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:09:21.396311 master-0 kubenswrapper[9368]: I1203 20:09:21.396220 9368 status_manager.go:851] "Failed to get status for pod" podUID="b0f7c518a656139710b17a7667c8b898" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" 
err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:09:21.923346 master-0 kubenswrapper[9368]: I1203 20:09:21.923291 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:09:21.928607 master-0 kubenswrapper[9368]: I1203 20:09:21.928349 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 20:09:22.045729 master-0 kubenswrapper[9368]: I1203 20:09:22.045636 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-config\") pod \"13238af3704fe583f617f61e755cf4c2\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " Dec 03 20:09:22.045998 master-0 kubenswrapper[9368]: I1203 20:09:22.045688 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-ssl-certs-host\") pod \"13238af3704fe583f617f61e755cf4c2\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " Dec 03 20:09:22.045998 master-0 kubenswrapper[9368]: I1203 20:09:22.045803 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-config" (OuterVolumeSpecName: "config") pod "13238af3704fe583f617f61e755cf4c2" (UID: "13238af3704fe583f617f61e755cf4c2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:09:22.045998 master-0 kubenswrapper[9368]: I1203 20:09:22.045814 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access\") pod \"e73e6013-87fc-40e2-a573-39930828faa7\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " Dec 03 20:09:22.045998 master-0 kubenswrapper[9368]: I1203 20:09:22.045893 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-etc-kubernetes-cloud\") pod \"13238af3704fe583f617f61e755cf4c2\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " Dec 03 20:09:22.045998 master-0 kubenswrapper[9368]: I1203 20:09:22.045913 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "13238af3704fe583f617f61e755cf4c2" (UID: "13238af3704fe583f617f61e755cf4c2"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:09:22.045998 master-0 kubenswrapper[9368]: I1203 20:09:22.045965 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-secrets" (OuterVolumeSpecName: "secrets") pod "13238af3704fe583f617f61e755cf4c2" (UID: "13238af3704fe583f617f61e755cf4c2"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:09:22.045998 master-0 kubenswrapper[9368]: I1203 20:09:22.045947 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-secrets\") pod \"13238af3704fe583f617f61e755cf4c2\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " Dec 03 20:09:22.046362 master-0 kubenswrapper[9368]: I1203 20:09:22.046060 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-var-lock\") pod \"e73e6013-87fc-40e2-a573-39930828faa7\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " Dec 03 20:09:22.046362 master-0 kubenswrapper[9368]: I1203 20:09:22.046094 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-logs\") pod \"13238af3704fe583f617f61e755cf4c2\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " Dec 03 20:09:22.046362 master-0 kubenswrapper[9368]: I1203 20:09:22.046129 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-kubelet-dir\") pod \"e73e6013-87fc-40e2-a573-39930828faa7\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " Dec 03 20:09:22.046362 master-0 kubenswrapper[9368]: I1203 20:09:22.046049 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "13238af3704fe583f617f61e755cf4c2" (UID: "13238af3704fe583f617f61e755cf4c2"). InnerVolumeSpecName "etc-kubernetes-cloud". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:09:22.046362 master-0 kubenswrapper[9368]: I1203 20:09:22.046200 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "13238af3704fe583f617f61e755cf4c2" (UID: "13238af3704fe583f617f61e755cf4c2"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:09:22.046362 master-0 kubenswrapper[9368]: I1203 20:09:22.046178 9368 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-audit-dir\") pod \"13238af3704fe583f617f61e755cf4c2\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " Dec 03 20:09:22.046362 master-0 kubenswrapper[9368]: I1203 20:09:22.046092 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-var-lock" (OuterVolumeSpecName: "var-lock") pod "e73e6013-87fc-40e2-a573-39930828faa7" (UID: "e73e6013-87fc-40e2-a573-39930828faa7"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:09:22.046362 master-0 kubenswrapper[9368]: I1203 20:09:22.046177 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-logs" (OuterVolumeSpecName: "logs") pod "13238af3704fe583f617f61e755cf4c2" (UID: "13238af3704fe583f617f61e755cf4c2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:09:22.046362 master-0 kubenswrapper[9368]: I1203 20:09:22.046221 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e73e6013-87fc-40e2-a573-39930828faa7" (UID: "e73e6013-87fc-40e2-a573-39930828faa7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:09:22.046724 master-0 kubenswrapper[9368]: I1203 20:09:22.046687 9368 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:22.046724 master-0 kubenswrapper[9368]: I1203 20:09:22.046716 9368 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-secrets\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:22.046832 master-0 kubenswrapper[9368]: I1203 20:09:22.046729 9368 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-logs\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:22.046832 master-0 kubenswrapper[9368]: I1203 20:09:22.046741 9368 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:22.046832 master-0 kubenswrapper[9368]: I1203 20:09:22.046753 9368 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:22.046832 master-0 kubenswrapper[9368]: I1203 20:09:22.046763 9368 reconciler_common.go:293] "Volume detached 
for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-audit-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:22.046832 master-0 kubenswrapper[9368]: I1203 20:09:22.046773 9368 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-config\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:22.046832 master-0 kubenswrapper[9368]: I1203 20:09:22.046802 9368 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:22.048813 master-0 kubenswrapper[9368]: I1203 20:09:22.048762 9368 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e73e6013-87fc-40e2-a573-39930828faa7" (UID: "e73e6013-87fc-40e2-a573-39930828faa7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:09:22.147637 master-0 kubenswrapper[9368]: I1203 20:09:22.147558 9368 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:22.403507 master-0 kubenswrapper[9368]: I1203 20:09:22.403444 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"efa3433149c0833909dd6c97d45272ed","Type":"ContainerStarted","Data":"906c3bca6e6dba68fc1f6de40f7c8c547d5581dd0b1d1d54e1eb0216877477d1"} Dec 03 20:09:22.403507 master-0 kubenswrapper[9368]: I1203 20:09:22.403489 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"efa3433149c0833909dd6c97d45272ed","Type":"ContainerStarted","Data":"009859dfb35cd527b6f73d6e859baa9e361df0c4b1b88a554c64f8f7fef46bf9"} Dec 03 20:09:22.403507 master-0 kubenswrapper[9368]: I1203 20:09:22.403499 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"efa3433149c0833909dd6c97d45272ed","Type":"ContainerStarted","Data":"22463a8cc479cad0b18b7fc08c942143546ab5eb5ece91f4b438d1d815531859"} Dec 03 20:09:22.405285 master-0 kubenswrapper[9368]: I1203 20:09:22.405233 9368 generic.go:334] "Generic (PLEG): container finished" podID="13238af3704fe583f617f61e755cf4c2" containerID="f5e393ad9e4b2248a04a2f1824c04ab00a09cf0b58d03e8aab531d5a360dcee3" exitCode=0 Dec 03 20:09:22.405368 master-0 kubenswrapper[9368]: I1203 20:09:22.405336 9368 scope.go:117] "RemoveContainer" containerID="c991f7eb8b7ab6bf0ec4d63b9e190d2c916d375ff6a09d91a53387bce5766b67" Dec 03 20:09:22.405522 master-0 kubenswrapper[9368]: I1203 20:09:22.405492 9368 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 20:09:22.416131 master-0 kubenswrapper[9368]: I1203 20:09:22.412911 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"e73e6013-87fc-40e2-a573-39930828faa7","Type":"ContainerDied","Data":"458285225e2ebef4d74a454c189a4334305b7449f5fc5767f5024ba6cedb0614"} Dec 03 20:09:22.416131 master-0 kubenswrapper[9368]: I1203 20:09:22.412952 9368 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="458285225e2ebef4d74a454c189a4334305b7449f5fc5767f5024ba6cedb0614" Dec 03 20:09:22.416131 master-0 kubenswrapper[9368]: I1203 20:09:22.413019 9368 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:09:22.417062 master-0 kubenswrapper[9368]: I1203 20:09:22.417039 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/3.log" Dec 03 20:09:22.420953 master-0 kubenswrapper[9368]: I1203 20:09:22.419137 9368 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/2.log" Dec 03 20:09:22.420953 master-0 kubenswrapper[9368]: I1203 20:09:22.420212 9368 generic.go:334] "Generic (PLEG): container finished" podID="3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf" containerID="d77636ae6fa70a30480be55d0b3b081bbffecdd76b888e95fdd9a2954e04756e" exitCode=1 Dec 03 20:09:22.420953 master-0 kubenswrapper[9368]: I1203 20:09:22.420893 9368 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" 
event={"ID":"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf","Type":"ContainerDied","Data":"d77636ae6fa70a30480be55d0b3b081bbffecdd76b888e95fdd9a2954e04756e"} Dec 03 20:09:22.421666 master-0 kubenswrapper[9368]: I1203 20:09:22.421633 9368 scope.go:117] "RemoveContainer" containerID="d77636ae6fa70a30480be55d0b3b081bbffecdd76b888e95fdd9a2954e04756e" Dec 03 20:09:22.421982 master-0 kubenswrapper[9368]: E1203 20:09:22.421945 9368 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-l7bzj_openshift-ingress-operator(3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" podUID="3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf" Dec 03 20:09:22.429048 master-0 kubenswrapper[9368]: I1203 20:09:22.428915 9368 scope.go:117] "RemoveContainer" containerID="f5e393ad9e4b2248a04a2f1824c04ab00a09cf0b58d03e8aab531d5a360dcee3" Dec 03 20:09:22.453125 master-0 kubenswrapper[9368]: I1203 20:09:22.453078 9368 scope.go:117] "RemoveContainer" containerID="0dd950185e59dc19fc3c4c25df60c0ffa205c3f9c227153b287f2a2e9b2b9bb6" Dec 03 20:09:22.489839 master-0 kubenswrapper[9368]: I1203 20:09:22.489680 9368 scope.go:117] "RemoveContainer" containerID="c991f7eb8b7ab6bf0ec4d63b9e190d2c916d375ff6a09d91a53387bce5766b67" Dec 03 20:09:22.490271 master-0 kubenswrapper[9368]: E1203 20:09:22.490240 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c991f7eb8b7ab6bf0ec4d63b9e190d2c916d375ff6a09d91a53387bce5766b67\": container with ID starting with c991f7eb8b7ab6bf0ec4d63b9e190d2c916d375ff6a09d91a53387bce5766b67 not found: ID does not exist" containerID="c991f7eb8b7ab6bf0ec4d63b9e190d2c916d375ff6a09d91a53387bce5766b67" Dec 03 20:09:22.490323 master-0 kubenswrapper[9368]: I1203 20:09:22.490281 9368 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c991f7eb8b7ab6bf0ec4d63b9e190d2c916d375ff6a09d91a53387bce5766b67"} err="failed to get container status \"c991f7eb8b7ab6bf0ec4d63b9e190d2c916d375ff6a09d91a53387bce5766b67\": rpc error: code = NotFound desc = could not find container \"c991f7eb8b7ab6bf0ec4d63b9e190d2c916d375ff6a09d91a53387bce5766b67\": container with ID starting with c991f7eb8b7ab6bf0ec4d63b9e190d2c916d375ff6a09d91a53387bce5766b67 not found: ID does not exist" Dec 03 20:09:22.490323 master-0 kubenswrapper[9368]: I1203 20:09:22.490315 9368 scope.go:117] "RemoveContainer" containerID="f5e393ad9e4b2248a04a2f1824c04ab00a09cf0b58d03e8aab531d5a360dcee3" Dec 03 20:09:22.490949 master-0 kubenswrapper[9368]: E1203 20:09:22.490919 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5e393ad9e4b2248a04a2f1824c04ab00a09cf0b58d03e8aab531d5a360dcee3\": container with ID starting with f5e393ad9e4b2248a04a2f1824c04ab00a09cf0b58d03e8aab531d5a360dcee3 not found: ID does not exist" containerID="f5e393ad9e4b2248a04a2f1824c04ab00a09cf0b58d03e8aab531d5a360dcee3" Dec 03 20:09:22.491010 master-0 kubenswrapper[9368]: I1203 20:09:22.490952 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5e393ad9e4b2248a04a2f1824c04ab00a09cf0b58d03e8aab531d5a360dcee3"} err="failed to get container status \"f5e393ad9e4b2248a04a2f1824c04ab00a09cf0b58d03e8aab531d5a360dcee3\": rpc error: code = NotFound desc = could not find container \"f5e393ad9e4b2248a04a2f1824c04ab00a09cf0b58d03e8aab531d5a360dcee3\": container with ID starting with f5e393ad9e4b2248a04a2f1824c04ab00a09cf0b58d03e8aab531d5a360dcee3 not found: ID does not exist" Dec 03 20:09:22.491010 master-0 kubenswrapper[9368]: I1203 20:09:22.490982 9368 scope.go:117] "RemoveContainer" containerID="0dd950185e59dc19fc3c4c25df60c0ffa205c3f9c227153b287f2a2e9b2b9bb6" Dec 03 20:09:22.491410 master-0 
kubenswrapper[9368]: E1203 20:09:22.491253 9368 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dd950185e59dc19fc3c4c25df60c0ffa205c3f9c227153b287f2a2e9b2b9bb6\": container with ID starting with 0dd950185e59dc19fc3c4c25df60c0ffa205c3f9c227153b287f2a2e9b2b9bb6 not found: ID does not exist" containerID="0dd950185e59dc19fc3c4c25df60c0ffa205c3f9c227153b287f2a2e9b2b9bb6" Dec 03 20:09:22.491410 master-0 kubenswrapper[9368]: I1203 20:09:22.491280 9368 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dd950185e59dc19fc3c4c25df60c0ffa205c3f9c227153b287f2a2e9b2b9bb6"} err="failed to get container status \"0dd950185e59dc19fc3c4c25df60c0ffa205c3f9c227153b287f2a2e9b2b9bb6\": rpc error: code = NotFound desc = could not find container \"0dd950185e59dc19fc3c4c25df60c0ffa205c3f9c227153b287f2a2e9b2b9bb6\": container with ID starting with 0dd950185e59dc19fc3c4c25df60c0ffa205c3f9c227153b287f2a2e9b2b9bb6 not found: ID does not exist" Dec 03 20:09:22.491410 master-0 kubenswrapper[9368]: I1203 20:09:22.491293 9368 scope.go:117] "RemoveContainer" containerID="befef1f27ec31a7dce800c4fe3b217c928cd2c29d212afb9d75ef9e969b32b96" Dec 03 20:09:22.560531 master-0 kubenswrapper[9368]: I1203 20:09:22.560458 9368 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13238af3704fe583f617f61e755cf4c2" path="/var/lib/kubelet/pods/13238af3704fe583f617f61e755cf4c2/volumes" Dec 03 20:09:22.561007 master-0 kubenswrapper[9368]: I1203 20:09:22.560969 9368 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Dec 03 20:09:25.036587 master-0 systemd[1]: Stopping Kubernetes Kubelet... Dec 03 20:09:25.084001 master-0 systemd[1]: kubelet.service: Deactivated successfully. Dec 03 20:09:25.084539 master-0 systemd[1]: Stopped Kubernetes Kubelet. 
Dec 03 20:09:25.088340 master-0 systemd[1]: kubelet.service: Consumed 1min 49.879s CPU time. Dec 03 20:09:25.118618 master-0 systemd[1]: Starting Kubernetes Kubelet... Dec 03 20:09:25.247680 master-0 kubenswrapper[29252]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 20:09:25.247680 master-0 kubenswrapper[29252]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 03 20:09:25.247680 master-0 kubenswrapper[29252]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 20:09:25.247680 master-0 kubenswrapper[29252]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 20:09:25.247680 master-0 kubenswrapper[29252]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 03 20:09:25.247680 master-0 kubenswrapper[29252]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 03 20:09:25.248398 master-0 kubenswrapper[29252]: I1203 20:09:25.247810 29252 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 03 20:09:25.251069 master-0 kubenswrapper[29252]: W1203 20:09:25.251039 29252 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 20:09:25.251069 master-0 kubenswrapper[29252]: W1203 20:09:25.251058 29252 feature_gate.go:330] unrecognized feature gate: Example Dec 03 20:09:25.251069 master-0 kubenswrapper[29252]: W1203 20:09:25.251065 29252 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 20:09:25.251069 master-0 kubenswrapper[29252]: W1203 20:09:25.251071 29252 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 20:09:25.251069 master-0 kubenswrapper[29252]: W1203 20:09:25.251077 29252 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251083 29252 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251088 29252 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251093 29252 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251099 29252 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251105 29252 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251109 29252 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251114 29252 feature_gate.go:330] unrecognized feature gate: 
GatewayAPI Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251119 29252 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251124 29252 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251130 29252 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251135 29252 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251140 29252 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251145 29252 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251151 29252 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251158 29252 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251164 29252 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251170 29252 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251175 29252 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251187 29252 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 20:09:25.251288 master-0 kubenswrapper[29252]: W1203 20:09:25.251194 29252 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251200 29252 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251205 29252 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251210 29252 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251216 29252 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251224 29252 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251229 29252 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251235 29252 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251241 29252 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251246 29252 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251251 29252 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251257 29252 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251261 29252 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251267 29252 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251273 29252 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251278 29252 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251283 29252 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251288 29252 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 20:09:25.252137 master-0 
kubenswrapper[29252]: W1203 20:09:25.251293 29252 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251298 29252 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 20:09:25.252137 master-0 kubenswrapper[29252]: W1203 20:09:25.251303 29252 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: W1203 20:09:25.251309 29252 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: W1203 20:09:25.251314 29252 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: W1203 20:09:25.251319 29252 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: W1203 20:09:25.251324 29252 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: W1203 20:09:25.251329 29252 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: W1203 20:09:25.251335 29252 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: W1203 20:09:25.251340 29252 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: W1203 20:09:25.251345 29252 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: W1203 20:09:25.251350 29252 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: W1203 20:09:25.251355 29252 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: 
W1203 20:09:25.251363 29252 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: W1203 20:09:25.251370 29252 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: W1203 20:09:25.251376 29252 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: W1203 20:09:25.251382 29252 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: W1203 20:09:25.251387 29252 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: W1203 20:09:25.251393 29252 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: W1203 20:09:25.251399 29252 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: W1203 20:09:25.251405 29252 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 20:09:25.252991 master-0 kubenswrapper[29252]: W1203 20:09:25.251411 29252 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: W1203 20:09:25.251416 29252 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: W1203 20:09:25.251421 29252 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: W1203 20:09:25.251428 29252 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: W1203 20:09:25.251435 29252 feature_gate.go:330] 
unrecognized feature gate: GCPLabelsTags Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: W1203 20:09:25.251441 29252 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: W1203 20:09:25.251447 29252 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: W1203 20:09:25.251452 29252 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: W1203 20:09:25.251457 29252 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: I1203 20:09:25.251573 29252 flags.go:64] FLAG: --address="0.0.0.0" Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: I1203 20:09:25.251583 29252 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: I1203 20:09:25.251592 29252 flags.go:64] FLAG: --anonymous-auth="true" Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: I1203 20:09:25.251599 29252 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: I1203 20:09:25.251606 29252 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: I1203 20:09:25.251613 29252 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: I1203 20:09:25.251620 29252 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: I1203 20:09:25.251627 29252 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: I1203 20:09:25.251633 29252 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: I1203 
20:09:25.251639 29252 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: I1203 20:09:25.251646 29252 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: I1203 20:09:25.251652 29252 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: I1203 20:09:25.251657 29252 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 03 20:09:25.253615 master-0 kubenswrapper[29252]: I1203 20:09:25.251663 29252 flags.go:64] FLAG: --cgroup-root="" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251671 29252 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251677 29252 flags.go:64] FLAG: --client-ca-file="" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251683 29252 flags.go:64] FLAG: --cloud-config="" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251689 29252 flags.go:64] FLAG: --cloud-provider="" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251695 29252 flags.go:64] FLAG: --cluster-dns="[]" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251702 29252 flags.go:64] FLAG: --cluster-domain="" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251708 29252 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251714 29252 flags.go:64] FLAG: --config-dir="" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251719 29252 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251725 29252 flags.go:64] FLAG: --container-log-max-files="5" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251733 29252 flags.go:64] FLAG: 
--container-log-max-size="10Mi" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251738 29252 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251744 29252 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251750 29252 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251756 29252 flags.go:64] FLAG: --contention-profiling="false" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251763 29252 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251768 29252 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251794 29252 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251800 29252 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251807 29252 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251813 29252 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251818 29252 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251824 29252 flags.go:64] FLAG: --enable-load-reader="false" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251829 29252 flags.go:64] FLAG: --enable-server="true" Dec 03 20:09:25.254426 master-0 kubenswrapper[29252]: I1203 20:09:25.251835 29252 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 
20:09:25.251841 29252 flags.go:64] FLAG: --event-burst="100" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251847 29252 flags.go:64] FLAG: --event-qps="50" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251853 29252 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251859 29252 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251864 29252 flags.go:64] FLAG: --eviction-hard="" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251871 29252 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251877 29252 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251882 29252 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251888 29252 flags.go:64] FLAG: --eviction-soft="" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251894 29252 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251899 29252 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251905 29252 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251910 29252 flags.go:64] FLAG: --experimental-mounter-path="" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251916 29252 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251922 29252 flags.go:64] FLAG: --fail-swap-on="true" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251927 29252 
flags.go:64] FLAG: --feature-gates="" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251934 29252 flags.go:64] FLAG: --file-check-frequency="20s" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251940 29252 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251946 29252 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251952 29252 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251957 29252 flags.go:64] FLAG: --healthz-port="10248" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251963 29252 flags.go:64] FLAG: --help="false" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251969 29252 flags.go:64] FLAG: --hostname-override="" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251975 29252 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251980 29252 flags.go:64] FLAG: --http-check-frequency="20s" Dec 03 20:09:25.255551 master-0 kubenswrapper[29252]: I1203 20:09:25.251986 29252 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.251991 29252 flags.go:64] FLAG: --image-credential-provider-config="" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.251997 29252 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252003 29252 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252009 29252 flags.go:64] FLAG: --image-service-endpoint="" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252014 29252 flags.go:64] FLAG: 
--kernel-memcg-notification="false" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252020 29252 flags.go:64] FLAG: --kube-api-burst="100" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252025 29252 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252031 29252 flags.go:64] FLAG: --kube-api-qps="50" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252037 29252 flags.go:64] FLAG: --kube-reserved="" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252043 29252 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252048 29252 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252054 29252 flags.go:64] FLAG: --kubelet-cgroups="" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252059 29252 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252066 29252 flags.go:64] FLAG: --lock-file="" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252071 29252 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252076 29252 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252083 29252 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252095 29252 flags.go:64] FLAG: --log-json-split-stream="false" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252101 29252 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252106 29252 flags.go:64] FLAG: 
--log-text-split-stream="false" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252112 29252 flags.go:64] FLAG: --logging-format="text" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252118 29252 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252124 29252 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252130 29252 flags.go:64] FLAG: --manifest-url="" Dec 03 20:09:25.257432 master-0 kubenswrapper[29252]: I1203 20:09:25.252136 29252 flags.go:64] FLAG: --manifest-url-header="" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252143 29252 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252149 29252 flags.go:64] FLAG: --max-open-files="1000000" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252156 29252 flags.go:64] FLAG: --max-pods="110" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252162 29252 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252168 29252 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252174 29252 flags.go:64] FLAG: --memory-manager-policy="None" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252179 29252 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252185 29252 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252190 29252 flags.go:64] FLAG: --node-ip="192.168.32.10" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252196 29252 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252207 29252 flags.go:64] FLAG: --node-status-max-images="50" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252219 29252 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252224 29252 flags.go:64] FLAG: --oom-score-adj="-999" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252231 29252 flags.go:64] FLAG: --pod-cidr="" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252236 29252 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fff930cf757e23d388d86d05942b76e44d3bda5e387b299c239e4d12545d26dd" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252244 29252 flags.go:64] FLAG: --pod-manifest-path="" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252250 29252 flags.go:64] FLAG: --pod-max-pids="-1" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252256 29252 flags.go:64] FLAG: --pods-per-core="0" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252261 29252 flags.go:64] FLAG: --port="10250" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252267 29252 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252272 29252 flags.go:64] FLAG: --provider-id="" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252278 29252 flags.go:64] FLAG: --qos-reserved="" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252283 29252 flags.go:64] FLAG: --read-only-port="10255" Dec 03 20:09:25.258701 master-0 kubenswrapper[29252]: I1203 20:09:25.252289 29252 flags.go:64] FLAG: --register-node="true" Dec 03 
20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252294 29252 flags.go:64] FLAG: --register-schedulable="true" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252300 29252 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252309 29252 flags.go:64] FLAG: --registry-burst="10" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252314 29252 flags.go:64] FLAG: --registry-qps="5" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252320 29252 flags.go:64] FLAG: --reserved-cpus="" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252325 29252 flags.go:64] FLAG: --reserved-memory="" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252332 29252 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252338 29252 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252344 29252 flags.go:64] FLAG: --rotate-certificates="false" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252350 29252 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252356 29252 flags.go:64] FLAG: --runonce="false" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252362 29252 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252368 29252 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252374 29252 flags.go:64] FLAG: --seccomp-default="false" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252379 29252 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 03 20:09:25.259732 
master-0 kubenswrapper[29252]: I1203 20:09:25.252385 29252 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252391 29252 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252397 29252 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252402 29252 flags.go:64] FLAG: --storage-driver-password="root" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252408 29252 flags.go:64] FLAG: --storage-driver-secure="false" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252416 29252 flags.go:64] FLAG: --storage-driver-table="stats" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252422 29252 flags.go:64] FLAG: --storage-driver-user="root" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252427 29252 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252433 29252 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 03 20:09:25.259732 master-0 kubenswrapper[29252]: I1203 20:09:25.252438 29252 flags.go:64] FLAG: --system-cgroups="" Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: I1203 20:09:25.252444 29252 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: I1203 20:09:25.252453 29252 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: I1203 20:09:25.252458 29252 flags.go:64] FLAG: --tls-cert-file="" Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: I1203 20:09:25.252464 29252 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: I1203 20:09:25.252471 29252 flags.go:64] FLAG: --tls-min-version="" Dec 03 20:09:25.260879 
master-0 kubenswrapper[29252]: I1203 20:09:25.252477 29252 flags.go:64] FLAG: --tls-private-key-file="" Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: I1203 20:09:25.252482 29252 flags.go:64] FLAG: --topology-manager-policy="none" Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: I1203 20:09:25.252489 29252 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: I1203 20:09:25.252496 29252 flags.go:64] FLAG: --topology-manager-scope="container" Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: I1203 20:09:25.252501 29252 flags.go:64] FLAG: --v="2" Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: I1203 20:09:25.252510 29252 flags.go:64] FLAG: --version="false" Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: I1203 20:09:25.252518 29252 flags.go:64] FLAG: --vmodule="" Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: I1203 20:09:25.252524 29252 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: I1203 20:09:25.252530 29252 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: W1203 20:09:25.252664 29252 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: W1203 20:09:25.252694 29252 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: W1203 20:09:25.252701 29252 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: W1203 20:09:25.252707 29252 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: W1203 20:09:25.252712 29252 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: W1203 20:09:25.252718 29252 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: W1203 20:09:25.252724 29252 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 20:09:25.260879 master-0 kubenswrapper[29252]: W1203 20:09:25.252729 29252 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252736 29252 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252742 29252 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252747 29252 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252753 29252 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252758 29252 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252766 29252 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252771 29252 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252775 29252 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252797 29252 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252802 29252 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252807 29252 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252812 29252 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252817 29252 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252822 29252 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252828 29252 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252833 29252 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252838 29252 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252843 29252 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252848 29252 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 20:09:25.261684 master-0 kubenswrapper[29252]: W1203 20:09:25.252853 29252 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252857 29252 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252863 29252 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252868 29252 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252873 29252 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252880 29252 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252885 29252 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252891 29252 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252897 29252 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252903 29252 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252908 29252 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252913 29252 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252918 29252 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252924 29252 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252930 29252 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252936 29252 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252941 29252 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252947 29252 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252954 29252 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 20:09:25.262423 master-0 kubenswrapper[29252]: W1203 20:09:25.252959 29252 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.252964 29252 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.252969 29252 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.252974 29252 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.252979 29252 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.252984 29252 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.252989 29252 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.252994 29252 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.252998 29252 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.253004 29252 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.253009 29252 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.253014 29252 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.253020 29252 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.253025 29252 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.253030 29252 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.253035 29252 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.253039 29252 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.253044 29252 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.253052 29252 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.253057 29252 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 20:09:25.263309 master-0 kubenswrapper[29252]: W1203 20:09:25.253062 29252 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 20:09:25.264040 master-0 kubenswrapper[29252]: W1203 20:09:25.253067 29252 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 20:09:25.264040 master-0 kubenswrapper[29252]: W1203 20:09:25.253072 29252 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 20:09:25.264040 master-0 kubenswrapper[29252]: W1203 20:09:25.253077 29252 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 20:09:25.264040 master-0 kubenswrapper[29252]: W1203 20:09:25.253082 29252 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 20:09:25.264040 master-0 kubenswrapper[29252]: W1203 20:09:25.253087 29252 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 20:09:25.264040 master-0 kubenswrapper[29252]: I1203 20:09:25.253102 29252 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 20:09:25.264040 master-0 kubenswrapper[29252]: I1203 20:09:25.259915 29252 server.go:491] "Kubelet version" kubeletVersion="v1.31.13"
Dec 03 20:09:25.264040 master-0 kubenswrapper[29252]: I1203 20:09:25.259943 29252 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 03 20:09:25.264040 master-0 kubenswrapper[29252]: W1203 20:09:25.260044 29252 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 20:09:25.264040 master-0 kubenswrapper[29252]: W1203 20:09:25.260057 29252 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 20:09:25.264040 master-0 kubenswrapper[29252]: W1203 20:09:25.260067 29252 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 20:09:25.264040 master-0 kubenswrapper[29252]: W1203 20:09:25.260074 29252 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 20:09:25.264040 master-0 kubenswrapper[29252]: W1203 20:09:25.260082 29252 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 20:09:25.264040 master-0 kubenswrapper[29252]: W1203 20:09:25.260088 29252 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 20:09:25.264040 master-0 kubenswrapper[29252]: W1203 20:09:25.260095 29252 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260102 29252 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260109 29252 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260116 29252 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260123 29252 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260130 29252 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260137 29252 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260145 29252 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260153 29252 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260160 29252 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260170 29252 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260178 29252 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260185 29252 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260192 29252 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260198 29252 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260207 29252 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260217 29252 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260224 29252 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260230 29252 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260236 29252 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 20:09:25.264665 master-0 kubenswrapper[29252]: W1203 20:09:25.260241 29252 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 20:09:25.265669 master-0 kubenswrapper[29252]: W1203 20:09:25.260247 29252 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 20:09:25.265669 master-0 kubenswrapper[29252]: W1203 20:09:25.260252 29252 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 20:09:25.265669 master-0 kubenswrapper[29252]: W1203 20:09:25.260258 29252 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 20:09:25.265669 master-0 kubenswrapper[29252]: W1203 20:09:25.260263 29252 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 20:09:25.265669 master-0 kubenswrapper[29252]: W1203 20:09:25.260269 29252 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 20:09:25.265669 master-0 kubenswrapper[29252]: W1203 20:09:25.260274 29252 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 20:09:25.265669 master-0 kubenswrapper[29252]: W1203 20:09:25.260279 29252 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 20:09:25.265669 master-0 kubenswrapper[29252]: W1203 20:09:25.260285 29252 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 20:09:25.265669 master-0 kubenswrapper[29252]: W1203 20:09:25.260290 29252 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 20:09:25.265669 master-0 kubenswrapper[29252]: W1203 20:09:25.260295 29252 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 20:09:25.265669 master-0 kubenswrapper[29252]: W1203 20:09:25.260301 29252 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 20:09:25.265669 master-0 kubenswrapper[29252]: W1203 20:09:25.260306 29252 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 20:09:25.265669 master-0 kubenswrapper[29252]: W1203 20:09:25.260311 29252 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 20:09:25.265669 master-0 kubenswrapper[29252]: W1203 20:09:25.260316 29252 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 20:09:25.265669 master-0 kubenswrapper[29252]: W1203 20:09:25.260322 29252 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 20:09:25.265669 master-0 kubenswrapper[29252]: W1203 20:09:25.260327 29252 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 20:09:25.265669 master-0 kubenswrapper[29252]: W1203 20:09:25.260332 29252 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 20:09:25.265669 master-0 kubenswrapper[29252]: W1203 20:09:25.260339 29252 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260346 29252 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260352 29252 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260358 29252 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260364 29252 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260369 29252 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260376 29252 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260382 29252 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260387 29252 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260392 29252 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260397 29252 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260403 29252 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260408 29252 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260413 29252 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260418 29252 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260423 29252 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260429 29252 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260436 29252 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260442 29252 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260449 29252 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260455 29252 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 20:09:25.266561 master-0 kubenswrapper[29252]: W1203 20:09:25.260460 29252 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 20:09:25.267353 master-0 kubenswrapper[29252]: W1203 20:09:25.260466 29252 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 20:09:25.267353 master-0 kubenswrapper[29252]: W1203 20:09:25.260472 29252 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 20:09:25.267353 master-0 kubenswrapper[29252]: W1203 20:09:25.260477 29252 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 20:09:25.267353 master-0 kubenswrapper[29252]: W1203 20:09:25.260483 29252 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 20:09:25.267353 master-0 kubenswrapper[29252]: W1203 20:09:25.260490 29252 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 20:09:25.267353 master-0 kubenswrapper[29252]: W1203 20:09:25.260498 29252 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 20:09:25.267353 master-0 kubenswrapper[29252]: I1203 20:09:25.260508 29252 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 20:09:25.267353 master-0 kubenswrapper[29252]: W1203 20:09:25.260745 29252 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 20:09:25.267353 master-0 kubenswrapper[29252]: W1203 20:09:25.260758 29252 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 20:09:25.267353 master-0 kubenswrapper[29252]: W1203 20:09:25.260765 29252 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 20:09:25.267353 master-0 kubenswrapper[29252]: W1203 20:09:25.260773 29252 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 20:09:25.267353 master-0 kubenswrapper[29252]: W1203 20:09:25.260849 29252 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 20:09:25.267353 master-0 kubenswrapper[29252]: W1203 20:09:25.260865 29252 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 20:09:25.267353 master-0 kubenswrapper[29252]: W1203 20:09:25.260874 29252 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 20:09:25.267353 master-0 kubenswrapper[29252]: W1203 20:09:25.260882 29252 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.260889 29252 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.260896 29252 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.260902 29252 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.260907 29252 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.260913 29252 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.260921 29252 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.260927 29252 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.260932 29252 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.260937 29252 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.260942 29252 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.260948 29252 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.260953 29252 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.260959 29252 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.260965 29252 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.260973 29252 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.260983 29252 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.260991 29252 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.261000 29252 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.261006 29252 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 20:09:25.268461 master-0 kubenswrapper[29252]: W1203 20:09:25.261013 29252 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261020 29252 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261027 29252 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261033 29252 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261039 29252 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261046 29252 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261052 29252 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261058 29252 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261063 29252 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261068 29252 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261073 29252 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261079 29252 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261084 29252 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261089 29252 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261095 29252 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261100 29252 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261105 29252 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261110 29252 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261116 29252 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261121 29252 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 20:09:25.269508 master-0 kubenswrapper[29252]: W1203 20:09:25.261126 29252 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261131 29252 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261144 29252 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261152 29252 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261159 29252 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261166 29252 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261173 29252 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261179 29252 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261186 29252 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261192 29252 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261198 29252 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261203 29252 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261209 29252 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261214 29252 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261219 29252 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261224 29252 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261230 29252 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261235 29252 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261240 29252 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 20:09:25.270859 master-0 kubenswrapper[29252]: W1203 20:09:25.261245 29252 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 20:09:25.271539 master-0 kubenswrapper[29252]: W1203 20:09:25.261250 29252 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 20:09:25.271539 master-0 kubenswrapper[29252]: W1203 20:09:25.261256 29252 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 20:09:25.271539 master-0 kubenswrapper[29252]: W1203 20:09:25.261262 29252 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 20:09:25.271539 master-0 kubenswrapper[29252]: W1203 20:09:25.261267 29252 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 20:09:25.271539 master-0 kubenswrapper[29252]: W1203 20:09:25.261272 29252 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 20:09:25.271539 master-0 kubenswrapper[29252]: I1203 20:09:25.261280 29252 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 20:09:25.271539 master-0 kubenswrapper[29252]: I1203 20:09:25.261466 29252 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 03 20:09:25.271539 master-0 kubenswrapper[29252]: I1203 20:09:25.264136 29252 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 03 20:09:25.271539 master-0 kubenswrapper[29252]: I1203 20:09:25.264316 29252 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 03 20:09:25.271539 master-0 kubenswrapper[29252]: I1203 20:09:25.264840 29252 server.go:997] "Starting client certificate rotation"
Dec 03 20:09:25.271539 master-0 kubenswrapper[29252]: I1203 20:09:25.264859 29252 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 03 20:09:25.271539 master-0 kubenswrapper[29252]: I1203 20:09:25.265085 29252 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2025-12-04 19:44:55 +0000 UTC, rotation deadline is 2025-12-04 15:03:38.909925982 +0000 UTC
Dec 03 20:09:25.271539 master-0 kubenswrapper[29252]: I1203 20:09:25.265144 29252 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 18h54m13.644785939s for next certificate rotation
Dec 03 20:09:25.275356 master-0 kubenswrapper[29252]: I1203 20:09:25.275310 29252 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 20:09:25.277150 master-0 kubenswrapper[29252]: I1203 20:09:25.277083 29252 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 20:09:25.286361 master-0 kubenswrapper[29252]: I1203 20:09:25.285629 29252 log.go:25] "Validated CRI v1 runtime API"
Dec 03 20:09:25.290478 master-0 kubenswrapper[29252]: I1203 20:09:25.290436 29252 log.go:25] "Validated CRI v1 image API"
Dec 03 20:09:25.292554 master-0 kubenswrapper[29252]: I1203 20:09:25.292515 29252 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 03 20:09:25.305431 master-0 kubenswrapper[29252]: I1203 20:09:25.305364 29252 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4 a110c2ad-b51b-427d-8eb4-4344f49e01ee:/dev/vda3]
Dec 03 20:09:25.306219 master-0 kubenswrapper[29252]: I1203 20:09:25.305446 29252 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/048697d9a6342582c8b3059ecb9a0cfe7c0a764a192a00f0ded82f4081cc7252/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/048697d9a6342582c8b3059ecb9a0cfe7c0a764a192a00f0ded82f4081cc7252/userdata/shm major:0 minor:136 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/06784de650baea189078ed954c09cf1adab506ac7eaeb2563708127435863bfd/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/06784de650baea189078ed954c09cf1adab506ac7eaeb2563708127435863bfd/userdata/shm major:0 minor:924 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/06b5799dce9659bf0df409ce3b2524ff568aaba7fc6e7ca8b83098be9071ffc9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/06b5799dce9659bf0df409ce3b2524ff568aaba7fc6e7ca8b83098be9071ffc9/userdata/shm major:0 minor:728 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/08c01ca5f1fe5f2ef9cd1ac17b729f8e737e95206dcb86f9ce9c09225b746a55/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/08c01ca5f1fe5f2ef9cd1ac17b729f8e737e95206dcb86f9ce9c09225b746a55/userdata/shm major:0 minor:342 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0b9c573f7ba19dc3323c14093fb10a43f5d1d1f19bc23f8da28f974d65efe3f1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0b9c573f7ba19dc3323c14093fb10a43f5d1d1f19bc23f8da28f974d65efe3f1/userdata/shm major:0 minor:594 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0c22de28b514bd9de5323a780b66baaf0574a8898405da26c3c85130d1ec1ce9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0c22de28b514bd9de5323a780b66baaf0574a8898405da26c3c85130d1ec1ce9/userdata/shm major:0 minor:315 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0c718666bc5fa03621c39d9ebf94cf18c64cdbdf19cd3de5b727dc2e38eb8ea5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0c718666bc5fa03621c39d9ebf94cf18c64cdbdf19cd3de5b727dc2e38eb8ea5/userdata/shm major:0 minor:730 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0d31ad42fdaaa8d9f4506f72df0676530f77957571a46716dc1e834dfef43d2c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0d31ad42fdaaa8d9f4506f72df0676530f77957571a46716dc1e834dfef43d2c/userdata/shm major:0 minor:505 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/15fa7ece9624e476a927666dc492b7bd2df94f7942d686ce643ec390d690ecca/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/15fa7ece9624e476a927666dc492b7bd2df94f7942d686ce643ec390d690ecca/userdata/shm major:0 minor:387 fsType:tmpfs blockSize:0}
/run/containers/storage/overlay-containers/1677ae8793f1b3e61b335ded5b7ac95e63d604742bdba149b92ecb06281d760f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1677ae8793f1b3e61b335ded5b7ac95e63d604742bdba149b92ecb06281d760f/userdata/shm major:0 minor:171 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1992b1130615c3114c9b58cd6decbf77558f0295aafbe17982440031c3ee9788/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1992b1130615c3114c9b58cd6decbf77558f0295aafbe17982440031c3ee9788/userdata/shm major:0 minor:954 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1caef0a819570cf6a2811866d8d10fd6e09b188be5e4d722967523e3ffefcc98/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1caef0a819570cf6a2811866d8d10fd6e09b188be5e4d722967523e3ffefcc98/userdata/shm major:0 minor:116 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1cd08c33a38d123c20d17a144cb73cdc913867f657f3ed47969c25f2ac5811c9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1cd08c33a38d123c20d17a144cb73cdc913867f657f3ed47969c25f2ac5811c9/userdata/shm major:0 minor:981 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1cf57a007c4e3680497adee52392a99d33552f24788c0574cbafbc31f9dc73f4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1cf57a007c4e3680497adee52392a99d33552f24788c0574cbafbc31f9dc73f4/userdata/shm major:0 minor:1010 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/28a18281ec372fa67100c899a3d3b1ddbaac78df588b0cd751eb6a61fdd46f87/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/28a18281ec372fa67100c899a3d3b1ddbaac78df588b0cd751eb6a61fdd46f87/userdata/shm major:0 minor:1029 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/2ae6841c89bd0bc9cfc6015de7cc1e3a4bbed5c62b59fd91032790f9ed1aaac0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2ae6841c89bd0bc9cfc6015de7cc1e3a4bbed5c62b59fd91032790f9ed1aaac0/userdata/shm major:0 minor:456 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2e5b6b4913ad9a7e9beefb1308e65939d7d65885f92832939f4bd387eda50473/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2e5b6b4913ad9a7e9beefb1308e65939d7d65885f92832939f4bd387eda50473/userdata/shm major:0 minor:1365 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/38072447ae412858938614108f0275e0c66bb65d93f888cc2667f73663ae0790/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/38072447ae412858938614108f0275e0c66bb65d93f888cc2667f73663ae0790/userdata/shm major:0 minor:346 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3cff086346a7c5e3777cf149e0e1d8f97d1a0c5b1f9e52848dc132dcdccf253d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3cff086346a7c5e3777cf149e0e1d8f97d1a0c5b1f9e52848dc132dcdccf253d/userdata/shm major:0 minor:982 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/42bc79c7b9ffa15ca475a4edc477b358626509600367cbde78e61fb4d3277efb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/42bc79c7b9ffa15ca475a4edc477b358626509600367cbde78e61fb4d3277efb/userdata/shm major:0 minor:892 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/46b628f030def8d568abe6c88697be71ce064596569bc0a66bddd83c9802cf26/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/46b628f030def8d568abe6c88697be71ce064596569bc0a66bddd83c9802cf26/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/49efa7facfce8d50bf6399ae2e6f96a9a16dc5f311b520ce196c50d981643fd1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/49efa7facfce8d50bf6399ae2e6f96a9a16dc5f311b520ce196c50d981643fd1/userdata/shm major:0 minor:320 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4a1b2034d20b8550395063b65a0de0eddb16cb0c3a6fde052b4127e400052376/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4a1b2034d20b8550395063b65a0de0eddb16cb0c3a6fde052b4127e400052376/userdata/shm major:0 minor:1006 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/52e10ffcc1fbdf8f2cb9d16e424d95ecef32b76b41b9a925005182a3b5446923/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/52e10ffcc1fbdf8f2cb9d16e424d95ecef32b76b41b9a925005182a3b5446923/userdata/shm major:0 minor:577 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5bee7d8031a36fa09960f186184717b2ac09e44e86995d183c886a9ab1dcdca8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5bee7d8031a36fa09960f186184717b2ac09e44e86995d183c886a9ab1dcdca8/userdata/shm major:0 minor:977 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6a5af31c4c1e2f84958d04c9531001f07d3ef520fdf16d375a2d25f61196cfa7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6a5af31c4c1e2f84958d04c9531001f07d3ef520fdf16d375a2d25f61196cfa7/userdata/shm major:0 minor:694 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7311eb8e0cfeb885addad4bf6c0ceae3553a0417b770ce4938a40cee85fb2dfd/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7311eb8e0cfeb885addad4bf6c0ceae3553a0417b770ce4938a40cee85fb2dfd/userdata/shm major:0 minor:1017 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/75ad2809d96a1369619e26966fceb45e6c13fc754c6dc35b21749d37ba20ab2a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/75ad2809d96a1369619e26966fceb45e6c13fc754c6dc35b21749d37ba20ab2a/userdata/shm major:0 minor:689 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/75ad395a74500e699e0114a02b486d58badb2f6e46a9b16d69b6836ed61de9f2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/75ad395a74500e699e0114a02b486d58badb2f6e46a9b16d69b6836ed61de9f2/userdata/shm major:0 minor:1007 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/75f2d2ce983b4d5090010050d78ba28c8452643f80661c230a1cbdc90a216214/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/75f2d2ce983b4d5090010050d78ba28c8452643f80661c230a1cbdc90a216214/userdata/shm major:0 minor:187 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7bfd19dcf77f81a2da47b10628f23027c2e3ee7dbe77cc6ea6e50ab79c6df0a9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7bfd19dcf77f81a2da47b10628f23027c2e3ee7dbe77cc6ea6e50ab79c6df0a9/userdata/shm major:0 minor:81 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/85f7ddcd30f09f1a0fda67d2dbaf1344d49e468b4e45601d31e0dfb9ac188ad5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/85f7ddcd30f09f1a0fda67d2dbaf1344d49e468b4e45601d31e0dfb9ac188ad5/userdata/shm major:0 minor:1046 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8b5478f55322c86d9620262432fda124f2df1ae79e09d51d64ffbf6929820091/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8b5478f55322c86d9620262432fda124f2df1ae79e09d51d64ffbf6929820091/userdata/shm major:0 minor:358 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/94345aca0ffddac869becc835a6c22d571aca8cdc67c8d1a0844b640b65b6099/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/94345aca0ffddac869becc835a6c22d571aca8cdc67c8d1a0844b640b65b6099/userdata/shm major:0 minor:961 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/988c74f72d6d3987e23eadc15e10a46097f9412b88f2d407e398a913b05fa016/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/988c74f72d6d3987e23eadc15e10a46097f9412b88f2d407e398a913b05fa016/userdata/shm major:0 minor:488 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9a60557e4a853b254e1a52367430f6552fb59c31039de6af8378df26f94038fb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9a60557e4a853b254e1a52367430f6552fb59c31039de6af8378df26f94038fb/userdata/shm major:0 minor:685 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9ae1ee41d37f4b0f2aff315d3bc5733756252272483e29c7d8046c2d96630d79/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9ae1ee41d37f4b0f2aff315d3bc5733756252272483e29c7d8046c2d96630d79/userdata/shm major:0 minor:501 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a336a885dee021602a811c06c05965d3ceafbc2a4e4dc7061efbb563491832b7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a336a885dee021602a811c06c05965d3ceafbc2a4e4dc7061efbb563491832b7/userdata/shm major:0 minor:324 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a57372b0142961fc3eb84ca639278793b8a44eddc61b15169b7d9172b7c9d91a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a57372b0142961fc3eb84ca639278793b8a44eddc61b15169b7d9172b7c9d91a/userdata/shm major:0 minor:522 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/ae7d38a01611431f5b1b916ad750e69ffaaacefc6c9b10d1dad35bf2f9161d22/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ae7d38a01611431f5b1b916ad750e69ffaaacefc6c9b10d1dad35bf2f9161d22/userdata/shm major:0 minor:678 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b0f89725c2a6c3514238a4cc365a81c3b56d37ffea32d9d0a2e9a1e91fecf2fb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b0f89725c2a6c3514238a4cc365a81c3b56d37ffea32d9d0a2e9a1e91fecf2fb/userdata/shm major:0 minor:629 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b4885c85229f1632ce115036d60c7a6767b9efe2b85e96fadba3614a99fdc575/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b4885c85229f1632ce115036d60c7a6767b9efe2b85e96fadba3614a99fdc575/userdata/shm major:0 minor:326 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b4d8dcd686b7f438d91027e16be00d386ed8e811dad59ae3d10143a981ef3034/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b4d8dcd686b7f438d91027e16be00d386ed8e811dad59ae3d10143a981ef3034/userdata/shm major:0 minor:543 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b7019b680708a6b0cc34565d068ec422e5cf82d6c1379cc668471d678f72f33d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b7019b680708a6b0cc34565d068ec422e5cf82d6c1379cc668471d678f72f33d/userdata/shm major:0 minor:163 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b7caf673c76ae18dcf4a0dfc42dc02071d5031c44976dc3b0bf55ef4e26083bf/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b7caf673c76ae18dcf4a0dfc42dc02071d5031c44976dc3b0bf55ef4e26083bf/userdata/shm major:0 minor:587 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/b9062d56a3074fcc3f3a4a8ecee0d9736b5e9e6f4c5eef18fa307a87652c36a3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b9062d56a3074fcc3f3a4a8ecee0d9736b5e9e6f4c5eef18fa307a87652c36a3/userdata/shm major:0 minor:1002 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bb807fb004e1c5a8c12ce908fa4f2effefa5e62f25142bb2fe3ec8dd74d140f1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bb807fb004e1c5a8c12ce908fa4f2effefa5e62f25142bb2fe3ec8dd74d140f1/userdata/shm major:0 minor:108 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c0a70d9d0d86d6f719dfcdce57dead2d1b8eec5a2b0f03bea14ce004f4ee91ea/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c0a70d9d0d86d6f719dfcdce57dead2d1b8eec5a2b0f03bea14ce004f4ee91ea/userdata/shm major:0 minor:679 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c8fa62db9ae1d5afc07c786415f97448d1baeaca29acf6f92b49c7da920421a7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c8fa62db9ae1d5afc07c786415f97448d1baeaca29acf6f92b49c7da920421a7/userdata/shm major:0 minor:317 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d0d3daec6476579642facd81cb6257eb10f7c617299056e3757e4a0c79c948a4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d0d3daec6476579642facd81cb6257eb10f7c617299056e3757e4a0c79c948a4/userdata/shm major:0 minor:121 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d1fa423033e36d1ea0fc496cea36bc32452af1d97e8f92ea7f243855f38360a2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d1fa423033e36d1ea0fc496cea36bc32452af1d97e8f92ea7f243855f38360a2/userdata/shm major:0 minor:1008 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/d48938de69765a143714e3f72409a39d0152006d3aa2fff72b2bf45a3ae1e272/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d48938de69765a143714e3f72409a39d0152006d3aa2fff72b2bf45a3ae1e272/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d4b28ab1c5b84f1f69a50cfae68a166f61b8e5091b37338c90666da83b930b13/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d4b28ab1c5b84f1f69a50cfae68a166f61b8e5091b37338c90666da83b930b13/userdata/shm major:0 minor:683 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d6be581fee143ab495ef9288ca56547cef2f234318e097d914e85b3da00c3425/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d6be581fee143ab495ef9288ca56547cef2f234318e097d914e85b3da00c3425/userdata/shm major:0 minor:590 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d9703a47499dc38f6845f2d55184a1985a6a96f9f0e663c0707d6562d50b0c0c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d9703a47499dc38f6845f2d55184a1985a6a96f9f0e663c0707d6562d50b0c0c/userdata/shm major:0 minor:1260 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/db6143edbd1b68cfe8bbe553ee3ca87d799ea0e63aff48d4d038dfa43496204a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/db6143edbd1b68cfe8bbe553ee3ca87d799ea0e63aff48d4d038dfa43496204a/userdata/shm major:0 minor:382 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/dbc9d9f3c90ebc5bbbfe36c2028e07277634315bcc3781675056eb652072f16a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/dbc9d9f3c90ebc5bbbfe36c2028e07277634315bcc3781675056eb652072f16a/userdata/shm major:0 minor:591 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/dbe65295e2c898be586dca5d88680f9b16d8f0721a6e9ed04f2477053779cf26/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/dbe65295e2c898be586dca5d88680f9b16d8f0721a6e9ed04f2477053779cf26/userdata/shm major:0 minor:576 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e2387dbfcc1d429cb65e949d260da12685f9167ab5d7e2e2846349bd7d4f915e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e2387dbfcc1d429cb65e949d260da12685f9167ab5d7e2e2846349bd7d4f915e/userdata/shm major:0 minor:680 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e59b58099dad5ea5ed0fd3c1716f8fb9f04f32f368cb6e0afc9cede661e06a70/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e59b58099dad5ea5ed0fd3c1716f8fb9f04f32f368cb6e0afc9cede661e06a70/userdata/shm major:0 minor:586 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e6304ea619f0b996e6dede6cd4e07910aa977eac4013d0444808ca8298842f22/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e6304ea619f0b996e6dede6cd4e07910aa977eac4013d0444808ca8298842f22/userdata/shm major:0 minor:327 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ec90b46e5817f62e5cb3d92e8419aeaaa1a2c0a9eebd84f2c7545dcfdabcf365/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ec90b46e5817f62e5cb3d92e8419aeaaa1a2c0a9eebd84f2c7545dcfdabcf365/userdata/shm major:0 minor:331 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ef04edbf93893169f2ce0656a624fe737e2b430675591752e41e98b545e6bf40/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ef04edbf93893169f2ce0656a624fe737e2b430675591752e41e98b545e6bf40/userdata/shm major:0 minor:334 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/f6cfc0641f7e192cbb940115d2ba3add0762b14146ea756523e733a04332e0a9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f6cfc0641f7e192cbb940115d2ba3add0762b14146ea756523e733a04332e0a9/userdata/shm major:0 minor:337 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fcef5c26197d88811bd202bc70d1bd384b05a27d2d38eb35b486b482203bd347/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fcef5c26197d88811bd202bc70d1bd384b05a27d2d38eb35b486b482203bd347/userdata/shm major:0 minor:104 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fd040c8de744a713ee80a954f75065a2b691638426b8496773ad0910f9875316/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fd040c8de744a713ee80a954f75065a2b691638426b8496773ad0910f9875316/userdata/shm major:0 minor:122 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/01d51d9a-9beb-4357-9dc2-aeac210cd0c4/volumes/kubernetes.io~projected/kube-api-access-6sqtm:{mountpoint:/var/lib/kubelet/pods/01d51d9a-9beb-4357-9dc2-aeac210cd0c4/volumes/kubernetes.io~projected/kube-api-access-6sqtm major:0 minor:323 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/01d51d9a-9beb-4357-9dc2-aeac210cd0c4/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/01d51d9a-9beb-4357-9dc2-aeac210cd0c4/volumes/kubernetes.io~secret/serving-cert major:0 minor:298 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/09f5df5c-fd9b-430d-aecc-242594b4aff1/volumes/kubernetes.io~projected/kube-api-access-twlw5:{mountpoint:/var/lib/kubelet/pods/09f5df5c-fd9b-430d-aecc-242594b4aff1/volumes/kubernetes.io~projected/kube-api-access-twlw5 major:0 minor:960 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/09f5df5c-fd9b-430d-aecc-242594b4aff1/volumes/kubernetes.io~secret/machine-approver-tls:{mountpoint:/var/lib/kubelet/pods/09f5df5c-fd9b-430d-aecc-242594b4aff1/volumes/kubernetes.io~secret/machine-approver-tls major:0 minor:918 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/volumes/kubernetes.io~projected/kube-api-access-qdhcd:{mountpoint:/var/lib/kubelet/pods/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/volumes/kubernetes.io~projected/kube-api-access-qdhcd major:0 minor:311 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/volumes/kubernetes.io~secret/serving-cert major:0 minor:288 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0d4e4f88-7106-4a46-8b63-053345922fb0/volumes/kubernetes.io~projected/kube-api-access-crfnp:{mountpoint:/var/lib/kubelet/pods/0d4e4f88-7106-4a46-8b63-053345922fb0/volumes/kubernetes.io~projected/kube-api-access-crfnp major:0 minor:286 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0d4e4f88-7106-4a46-8b63-053345922fb0/volumes/kubernetes.io~secret/package-server-manager-serving-cert:{mountpoint:/var/lib/kubelet/pods/0d4e4f88-7106-4a46-8b63-053345922fb0/volumes/kubernetes.io~secret/package-server-manager-serving-cert major:0 minor:674 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/11e2c94f-f9e9-415b-a550-3006a4632ba4/volumes/kubernetes.io~projected/kube-api-access-pfqnq:{mountpoint:/var/lib/kubelet/pods/11e2c94f-f9e9-415b-a550-3006a4632ba4/volumes/kubernetes.io~projected/kube-api-access-pfqnq major:0 minor:319 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/11e2c94f-f9e9-415b-a550-3006a4632ba4/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/11e2c94f-f9e9-415b-a550-3006a4632ba4/volumes/kubernetes.io~secret/serving-cert major:0 minor:296 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/128ed384-7ab6-41b6-bf45-c8fda917d52f/volumes/kubernetes.io~projected/kube-api-access-7qrgh:{mountpoint:/var/lib/kubelet/pods/128ed384-7ab6-41b6-bf45-c8fda917d52f/volumes/kubernetes.io~projected/kube-api-access-7qrgh major:0 minor:336 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/128ed384-7ab6-41b6-bf45-c8fda917d52f/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/128ed384-7ab6-41b6-bf45-c8fda917d52f/volumes/kubernetes.io~secret/metrics-tls major:0 minor:574 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1c22cb59-5083-4be6-9998-a9e67a2c20cd/volumes/kubernetes.io~projected/kube-api-access-7cnmn:{mountpoint:/var/lib/kubelet/pods/1c22cb59-5083-4be6-9998-a9e67a2c20cd/volumes/kubernetes.io~projected/kube-api-access-7cnmn major:0 minor:367 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1c22cb59-5083-4be6-9998-a9e67a2c20cd/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/1c22cb59-5083-4be6-9998-a9e67a2c20cd/volumes/kubernetes.io~secret/serving-cert major:0 minor:366 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1f82c7a1-ec21-497d-86f2-562cafa7ace7/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/1f82c7a1-ec21-497d-86f2-562cafa7ace7/volumes/kubernetes.io~projected/ca-certs major:0 minor:791 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1f82c7a1-ec21-497d-86f2-562cafa7ace7/volumes/kubernetes.io~projected/kube-api-access-95zsj:{mountpoint:/var/lib/kubelet/pods/1f82c7a1-ec21-497d-86f2-562cafa7ace7/volumes/kubernetes.io~projected/kube-api-access-95zsj major:0 minor:792 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1f82c7a1-ec21-497d-86f2-562cafa7ace7/volumes/kubernetes.io~secret/catalogserver-certs:{mountpoint:/var/lib/kubelet/pods/1f82c7a1-ec21-497d-86f2-562cafa7ace7/volumes/kubernetes.io~secret/catalogserver-certs major:0 minor:790 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/2d43df9b-bb29-4581-8cd9-f3b9c0c0e4d9/volumes/kubernetes.io~projected/kube-api-access-grk2s:{mountpoint:/var/lib/kubelet/pods/2d43df9b-bb29-4581-8cd9-f3b9c0c0e4d9/volumes/kubernetes.io~projected/kube-api-access-grk2s major:0 minor:500 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2f618ea7-3ad7-4dce-b450-a8202285f312/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/2f618ea7-3ad7-4dce-b450-a8202285f312/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2f618ea7-3ad7-4dce-b450-a8202285f312/volumes/kubernetes.io~projected/kube-api-access-4c9qq:{mountpoint:/var/lib/kubelet/pods/2f618ea7-3ad7-4dce-b450-a8202285f312/volumes/kubernetes.io~projected/kube-api-access-4c9qq major:0 minor:170 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2f618ea7-3ad7-4dce-b450-a8202285f312/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/2f618ea7-3ad7-4dce-b450-a8202285f312/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:169 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/367c2c7c-1fc8-4608-aa94-b64c6c70cc61/volumes/kubernetes.io~projected/kube-api-access-hb5j7:{mountpoint:/var/lib/kubelet/pods/367c2c7c-1fc8-4608-aa94-b64c6c70cc61/volumes/kubernetes.io~projected/kube-api-access-hb5j7 major:0 minor:487 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/371917da-b783-4acc-81af-1cfc903269f4/volumes/kubernetes.io~projected/kube-api-access-w4v7k:{mountpoint:/var/lib/kubelet/pods/371917da-b783-4acc-81af-1cfc903269f4/volumes/kubernetes.io~projected/kube-api-access-w4v7k major:0 minor:341 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:333 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/volumes/kubernetes.io~projected/kube-api-access-6bhk4:{mountpoint:/var/lib/kubelet/pods/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/volumes/kubernetes.io~projected/kube-api-access-6bhk4 major:0 minor:303 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/volumes/kubernetes.io~secret/metrics-tls major:0 minor:580 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/433c3273-c99e-4d68-befc-06f92d2fc8d5/volumes/kubernetes.io~projected/kube-api-access-xwcj7:{mountpoint:/var/lib/kubelet/pods/433c3273-c99e-4d68-befc-06f92d2fc8d5/volumes/kubernetes.io~projected/kube-api-access-xwcj7 major:0 minor:969 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/433c3273-c99e-4d68-befc-06f92d2fc8d5/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/433c3273-c99e-4d68-befc-06f92d2fc8d5/volumes/kubernetes.io~secret/cert major:0 minor:657 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/433c3273-c99e-4d68-befc-06f92d2fc8d5/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls:{mountpoint:/var/lib/kubelet/pods/433c3273-c99e-4d68-befc-06f92d2fc8d5/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls major:0 minor:656 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46b5d4d0-b841-4e87-84b4-85911ff04325/volumes/kubernetes.io~projected/kube-api-access-s2c85:{mountpoint:/var/lib/kubelet/pods/46b5d4d0-b841-4e87-84b4-85911ff04325/volumes/kubernetes.io~projected/kube-api-access-s2c85 major:0 minor:146 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46b5d4d0-b841-4e87-84b4-85911ff04325/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/46b5d4d0-b841-4e87-84b4-85911ff04325/volumes/kubernetes.io~secret/metrics-certs major:0 minor:675 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/56e013ee-ea7a-4780-8986-a7fd1b5a3a3f/volumes/kubernetes.io~projected/kube-api-access-vvlxr:{mountpoint:/var/lib/kubelet/pods/56e013ee-ea7a-4780-8986-a7fd1b5a3a3f/volumes/kubernetes.io~projected/kube-api-access-vvlxr major:0 minor:455 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5b3ee9a2-0f17-4a04-9191-b60684ef6c29/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/5b3ee9a2-0f17-4a04-9191-b60684ef6c29/volumes/kubernetes.io~projected/kube-api-access major:0 minor:313 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5b3ee9a2-0f17-4a04-9191-b60684ef6c29/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/5b3ee9a2-0f17-4a04-9191-b60684ef6c29/volumes/kubernetes.io~secret/serving-cert major:0 minor:292 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5decce88-c71e-411c-87b5-a37dd0f77e7b/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/5decce88-c71e-411c-87b5-a37dd0f77e7b/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:304 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5decce88-c71e-411c-87b5-a37dd0f77e7b/volumes/kubernetes.io~projected/kube-api-access-mr8x9:{mountpoint:/var/lib/kubelet/pods/5decce88-c71e-411c-87b5-a37dd0f77e7b/volumes/kubernetes.io~projected/kube-api-access-mr8x9 major:0 minor:340 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5decce88-c71e-411c-87b5-a37dd0f77e7b/volumes/kubernetes.io~secret/image-registry-operator-tls:{mountpoint:/var/lib/kubelet/pods/5decce88-c71e-411c-87b5-a37dd0f77e7b/volumes/kubernetes.io~secret/image-registry-operator-tls major:0 minor:575 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/63e3d36d-1676-4f90-ac9a-d85b861a4655/volumes/kubernetes.io~projected/kube-api-access-x66sr:{mountpoint:/var/lib/kubelet/pods/63e3d36d-1676-4f90-ac9a-d85b861a4655/volumes/kubernetes.io~projected/kube-api-access-x66sr major:0 minor:499 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/63e3d36d-1676-4f90-ac9a-d85b861a4655/volumes/kubernetes.io~secret/signing-key:{mountpoint:/var/lib/kubelet/pods/63e3d36d-1676-4f90-ac9a-d85b861a4655/volumes/kubernetes.io~secret/signing-key major:0 minor:495 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6404bbc7-8ca9-4f20-8ce7-40f855555160/volumes/kubernetes.io~projected/kube-api-access-4d468:{mountpoint:/var/lib/kubelet/pods/6404bbc7-8ca9-4f20-8ce7-40f855555160/volumes/kubernetes.io~projected/kube-api-access-4d468 major:0 minor:964 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6404bbc7-8ca9-4f20-8ce7-40f855555160/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/6404bbc7-8ca9-4f20-8ce7-40f855555160/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert major:0 minor:963 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6a82ff78-4383-4ca8-8a72-98c2ee50ffe2/volumes/kubernetes.io~projected/kube-api-access-dl5h7:{mountpoint:/var/lib/kubelet/pods/6a82ff78-4383-4ca8-8a72-98c2ee50ffe2/volumes/kubernetes.io~projected/kube-api-access-dl5h7 major:0 minor:966 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6a82ff78-4383-4ca8-8a72-98c2ee50ffe2/volumes/kubernetes.io~secret/samples-operator-tls:{mountpoint:/var/lib/kubelet/pods/6a82ff78-4383-4ca8-8a72-98c2ee50ffe2/volumes/kubernetes.io~secret/samples-operator-tls major:0 minor:965 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6bb19329-c50c-4214-94c8-7e8771b99233/volumes/kubernetes.io~projected/kube-api-access-kszjr:{mountpoint:/var/lib/kubelet/pods/6bb19329-c50c-4214-94c8-7e8771b99233/volumes/kubernetes.io~projected/kube-api-access-kszjr major:0 minor:923 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6eb4700c-6af0-468b-afc8-1e09b902d6bf/volumes/kubernetes.io~projected/kube-api-access-w7nkb:{mountpoint:/var/lib/kubelet/pods/6eb4700c-6af0-468b-afc8-1e09b902d6bf/volumes/kubernetes.io~projected/kube-api-access-w7nkb major:0 minor:119 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/6eb4700c-6af0-468b-afc8-1e09b902d6bf/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/6eb4700c-6af0-468b-afc8-1e09b902d6bf/volumes/kubernetes.io~secret/metrics-tls major:0 minor:77 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/73b7027e-44f5-4c7b-9226-585a90530535/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/73b7027e-44f5-4c7b-9226-585a90530535/volumes/kubernetes.io~projected/ca-certs major:0 minor:788 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/73b7027e-44f5-4c7b-9226-585a90530535/volumes/kubernetes.io~projected/kube-api-access-7pf5q:{mountpoint:/var/lib/kubelet/pods/73b7027e-44f5-4c7b-9226-585a90530535/volumes/kubernetes.io~projected/kube-api-access-7pf5q major:0 minor:789 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/78a864f2-934f-4197-9753-24c9bc7f1fca/volumes/kubernetes.io~projected/kube-api-access-59d2r:{mountpoint:/var/lib/kubelet/pods/78a864f2-934f-4197-9753-24c9bc7f1fca/volumes/kubernetes.io~projected/kube-api-access-59d2r major:0 minor:306 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/78a864f2-934f-4197-9753-24c9bc7f1fca/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/78a864f2-934f-4197-9753-24c9bc7f1fca/volumes/kubernetes.io~secret/etcd-client major:0 minor:295 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/78a864f2-934f-4197-9753-24c9bc7f1fca/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/78a864f2-934f-4197-9753-24c9bc7f1fca/volumes/kubernetes.io~secret/serving-cert major:0 minor:287 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7ed25861-1328-45e7-922e-37588a0b019c/volumes/kubernetes.io~projected/kube-api-access-cv24n:{mountpoint:/var/lib/kubelet/pods/7ed25861-1328-45e7-922e-37588a0b019c/volumes/kubernetes.io~projected/kube-api-access-cv24n major:0 minor:301 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/7ed25861-1328-45e7-922e-37588a0b019c/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/7ed25861-1328-45e7-922e-37588a0b019c/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:444 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7ed25861-1328-45e7-922e-37588a0b019c/volumes/kubernetes.io~secret/node-tuning-operator-tls:{mountpoint:/var/lib/kubelet/pods/7ed25861-1328-45e7-922e-37588a0b019c/volumes/kubernetes.io~secret/node-tuning-operator-tls major:0 minor:442 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/830d89af-1266-43ac-b113-990a28595f91/volumes/kubernetes.io~projected/kube-api-access-lkhcw:{mountpoint:/var/lib/kubelet/pods/830d89af-1266-43ac-b113-990a28595f91/volumes/kubernetes.io~projected/kube-api-access-lkhcw major:0 minor:386 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/87f1759a-7df4-442e-a22d-6de8d54be333/volumes/kubernetes.io~projected/kube-api-access-wvllg:{mountpoint:/var/lib/kubelet/pods/87f1759a-7df4-442e-a22d-6de8d54be333/volumes/kubernetes.io~projected/kube-api-access-wvllg major:0 minor:135 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8/volumes/kubernetes.io~projected/kube-api-access-qzd2g:{mountpoint:/var/lib/kubelet/pods/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8/volumes/kubernetes.io~projected/kube-api-access-qzd2g major:0 minor:654 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8/volumes/kubernetes.io~secret/proxy-tls major:0 minor:608 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/90610a53-b590-491e-8014-f0704afdc6e1/volumes/kubernetes.io~projected/kube-api-access-4wcmd:{mountpoint:/var/lib/kubelet/pods/90610a53-b590-491e-8014-f0704afdc6e1/volumes/kubernetes.io~projected/kube-api-access-4wcmd major:0 minor:953 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/90610a53-b590-491e-8014-f0704afdc6e1/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls:{mountpoint:/var/lib/kubelet/pods/90610a53-b590-491e-8014-f0704afdc6e1/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls major:0 minor:900 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/943feb0d-7d31-446a-9100-dfc4ef013d12/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/943feb0d-7d31-446a-9100-dfc4ef013d12/volumes/kubernetes.io~projected/kube-api-access major:0 minor:307 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/943feb0d-7d31-446a-9100-dfc4ef013d12/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/943feb0d-7d31-446a-9100-dfc4ef013d12/volumes/kubernetes.io~secret/serving-cert major:0 minor:293 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b/volumes/kubernetes.io~projected/kube-api-access-c5dpx:{mountpoint:/var/lib/kubelet/pods/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b/volumes/kubernetes.io~projected/kube-api-access-c5dpx major:0 minor:369 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b/volumes/kubernetes.io~secret/proxy-tls major:0 minor:368 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a185ee17-4b4b-4d20-a8ed-56a2a01f1807/volumes/kubernetes.io~projected/kube-api-access-sxqph:{mountpoint:/var/lib/kubelet/pods/a185ee17-4b4b-4d20-a8ed-56a2a01f1807/volumes/kubernetes.io~projected/kube-api-access-sxqph major:0 minor:302 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a185ee17-4b4b-4d20-a8ed-56a2a01f1807/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a185ee17-4b4b-4d20-a8ed-56a2a01f1807/volumes/kubernetes.io~secret/serving-cert major:0 minor:294 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/a19b8f9e-6299-43bf-9aa5-22071b855773/volumes/kubernetes.io~projected/kube-api-access-6ghnf:{mountpoint:/var/lib/kubelet/pods/a19b8f9e-6299-43bf-9aa5-22071b855773/volumes/kubernetes.io~projected/kube-api-access-6ghnf major:0 minor:310 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a19b8f9e-6299-43bf-9aa5-22071b855773/volumes/kubernetes.io~secret/profile-collector-cert:{mountpoint:/var/lib/kubelet/pods/a19b8f9e-6299-43bf-9aa5-22071b855773/volumes/kubernetes.io~secret/profile-collector-cert major:0 minor:300 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a19b8f9e-6299-43bf-9aa5-22071b855773/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/a19b8f9e-6299-43bf-9aa5-22071b855773/volumes/kubernetes.io~secret/srv-cert major:0 minor:673 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a710102c-72fb-4d8d-ad99-71940368a09e/volumes/kubernetes.io~projected/kube-api-access-zgmkc:{mountpoint:/var/lib/kubelet/pods/a710102c-72fb-4d8d-ad99-71940368a09e/volumes/kubernetes.io~projected/kube-api-access-zgmkc major:0 minor:887 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6/volumes/kubernetes.io~projected/kube-api-access-bztz2:{mountpoint:/var/lib/kubelet/pods/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6/volumes/kubernetes.io~projected/kube-api-access-bztz2 major:0 minor:118 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ad22d8ed-2476-441b-aa3b-a7845606b0ac/volumes/kubernetes.io~projected/kube-api-access-xjn9m:{mountpoint:/var/lib/kubelet/pods/ad22d8ed-2476-441b-aa3b-a7845606b0ac/volumes/kubernetes.io~projected/kube-api-access-xjn9m major:0 minor:987 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ad22d8ed-2476-441b-aa3b-a7845606b0ac/volumes/kubernetes.io~secret/machine-api-operator-tls:{mountpoint:/var/lib/kubelet/pods/ad22d8ed-2476-441b-aa3b-a7845606b0ac/volumes/kubernetes.io~secret/machine-api-operator-tls major:0 minor:1019 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/af2023e1-9c7a-40af-a6bf-fba31c3565b1/volumes/kubernetes.io~projected/kube-api-access-hdd6z:{mountpoint:/var/lib/kubelet/pods/af2023e1-9c7a-40af-a6bf-fba31c3565b1/volumes/kubernetes.io~projected/kube-api-access-hdd6z major:0 minor:974 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/af2023e1-9c7a-40af-a6bf-fba31c3565b1/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/af2023e1-9c7a-40af-a6bf-fba31c3565b1/volumes/kubernetes.io~secret/serving-cert major:0 minor:973 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b2021db5-b27a-4e06-beec-d9ba82aa1ffc/volumes/kubernetes.io~projected/kube-api-access-j6skg:{mountpoint:/var/lib/kubelet/pods/b2021db5-b27a-4e06-beec-d9ba82aa1ffc/volumes/kubernetes.io~projected/kube-api-access-j6skg major:0 minor:970 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b2021db5-b27a-4e06-beec-d9ba82aa1ffc/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/b2021db5-b27a-4e06-beec-d9ba82aa1ffc/volumes/kubernetes.io~secret/cert major:0 minor:655 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b638f207-31df-4298-8801-4da6031deefc/volumes/kubernetes.io~projected/kube-api-access-trv6b:{mountpoint:/var/lib/kubelet/pods/b638f207-31df-4298-8801-4da6031deefc/volumes/kubernetes.io~projected/kube-api-access-trv6b major:0 minor:553 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b673cb04-f6f0-4113-bdcd-d6685b942c9f/volumes/kubernetes.io~projected/kube-api-access-m2qch:{mountpoint:/var/lib/kubelet/pods/b673cb04-f6f0-4113-bdcd-d6685b942c9f/volumes/kubernetes.io~projected/kube-api-access-m2qch major:0 minor:312 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b673cb04-f6f0-4113-bdcd-d6685b942c9f/volumes/kubernetes.io~secret/marketplace-operator-metrics:{mountpoint:/var/lib/kubelet/pods/b673cb04-f6f0-4113-bdcd-d6685b942c9f/volumes/kubernetes.io~secret/marketplace-operator-metrics major:0 minor:672 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b84835e3-e8bc-4aa4-a8f3-f9be702a358a/volumes/kubernetes.io~projected/kube-api-access-vtwbs:{mountpoint:/var/lib/kubelet/pods/b84835e3-e8bc-4aa4-a8f3-f9be702a358a/volumes/kubernetes.io~projected/kube-api-access-vtwbs major:0 minor:309 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b8709c6c-8729-4702-a3fb-35a072855096/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/b8709c6c-8729-4702-a3fb-35a072855096/volumes/kubernetes.io~projected/kube-api-access major:0 minor:879 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b8709c6c-8729-4702-a3fb-35a072855096/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b8709c6c-8729-4702-a3fb-35a072855096/volumes/kubernetes.io~secret/serving-cert major:0 minor:878 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ba68608f-6b36-455e-b80b-d19237df9312/volumes/kubernetes.io~projected/kube-api-access-855t4:{mountpoint:/var/lib/kubelet/pods/ba68608f-6b36-455e-b80b-d19237df9312/volumes/kubernetes.io~projected/kube-api-access-855t4 major:0 minor:381 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ba68608f-6b36-455e-b80b-d19237df9312/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls:{mountpoint:/var/lib/kubelet/pods/ba68608f-6b36-455e-b80b-d19237df9312/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls major:0 minor:671 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c3afc439-ccaa-4751-95a1-ac7557e326f0/volumes/kubernetes.io~projected/kube-api-access-ljsr6:{mountpoint:/var/lib/kubelet/pods/c3afc439-ccaa-4751-95a1-ac7557e326f0/volumes/kubernetes.io~projected/kube-api-access-ljsr6 major:0 minor:1315 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c3afc439-ccaa-4751-95a1-ac7557e326f0/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/c3afc439-ccaa-4751-95a1-ac7557e326f0/volumes/kubernetes.io~secret/webhook-certs major:0 minor:1314 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/c4d45235-fb1a-4626-a41e-b1e34f7bf76e/volumes/kubernetes.io~projected/kube-api-access-qhg82:{mountpoint:/var/lib/kubelet/pods/c4d45235-fb1a-4626-a41e-b1e34f7bf76e/volumes/kubernetes.io~projected/kube-api-access-qhg82 major:0 minor:185 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c4d45235-fb1a-4626-a41e-b1e34f7bf76e/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/c4d45235-fb1a-4626-a41e-b1e34f7bf76e/volumes/kubernetes.io~secret/webhook-cert major:0 minor:186 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c52974d8-fbe6-444b-97ae-468482eebac8/volumes/kubernetes.io~projected/kube-api-access-p7vxl:{mountpoint:/var/lib/kubelet/pods/c52974d8-fbe6-444b-97ae-468482eebac8/volumes/kubernetes.io~projected/kube-api-access-p7vxl major:0 minor:511 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c52974d8-fbe6-444b-97ae-468482eebac8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/c52974d8-fbe6-444b-97ae-468482eebac8/volumes/kubernetes.io~secret/serving-cert major:0 minor:510 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c593a75e-c2af-4419-94da-e0c9ff14c41f/volumes/kubernetes.io~projected/kube-api-access-j2xcx:{mountpoint:/var/lib/kubelet/pods/c593a75e-c2af-4419-94da-e0c9ff14c41f/volumes/kubernetes.io~projected/kube-api-access-j2xcx major:0 minor:666 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c593a75e-c2af-4419-94da-e0c9ff14c41f/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/c593a75e-c2af-4419-94da-e0c9ff14c41f/volumes/kubernetes.io~secret/encryption-config major:0 minor:664 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c593a75e-c2af-4419-94da-e0c9ff14c41f/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/c593a75e-c2af-4419-94da-e0c9ff14c41f/volumes/kubernetes.io~secret/etcd-client major:0 minor:665 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/c593a75e-c2af-4419-94da-e0c9ff14c41f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/c593a75e-c2af-4419-94da-e0c9ff14c41f/volumes/kubernetes.io~secret/serving-cert major:0 minor:726 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cd35fc5f-07ab-4c66-9b80-33a598d417ef/volumes/kubernetes.io~projected/kube-api-access-qk5wb:{mountpoint:/var/lib/kubelet/pods/cd35fc5f-07ab-4c66-9b80-33a598d417ef/volumes/kubernetes.io~projected/kube-api-access-qk5wb major:0 minor:958 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cd35fc5f-07ab-4c66-9b80-33a598d417ef/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls:{mountpoint:/var/lib/kubelet/pods/cd35fc5f-07ab-4c66-9b80-33a598d417ef/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls major:0 minor:957 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2/volumes/kubernetes.io~projected/kube-api-access-7bdn5:{mountpoint:/var/lib/kubelet/pods/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2/volumes/kubernetes.io~projected/kube-api-access-7bdn5 major:0 minor:528 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d196dca7-f940-4aa0-b20a-214d22b62db6/volumes/kubernetes.io~projected/kube-api-access-tphq2:{mountpoint:/var/lib/kubelet/pods/d196dca7-f940-4aa0-b20a-214d22b62db6/volumes/kubernetes.io~projected/kube-api-access-tphq2 major:0 minor:692 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d196dca7-f940-4aa0-b20a-214d22b62db6/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/d196dca7-f940-4aa0-b20a-214d22b62db6/volumes/kubernetes.io~secret/metrics-tls major:0 minor:727 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d210062f-c07e-419f-a551-c37571565686/volumes/kubernetes.io~projected/kube-api-access-v7xk9:{mountpoint:/var/lib/kubelet/pods/d210062f-c07e-419f-a551-c37571565686/volumes/kubernetes.io~projected/kube-api-access-v7xk9 major:0 minor:162 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/d210062f-c07e-419f-a551-c37571565686/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/d210062f-c07e-419f-a551-c37571565686/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:161 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/volumes/kubernetes.io~projected/kube-api-access-457ln:{mountpoint:/var/lib/kubelet/pods/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/volumes/kubernetes.io~projected/kube-api-access-457ln major:0 minor:314 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/volumes/kubernetes.io~secret/serving-cert major:0 minor:289 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d5f33153-bff1-403f-ae17-b7e90500365d/volumes/kubernetes.io~projected/kube-api-access-5sdw4:{mountpoint:/var/lib/kubelet/pods/d5f33153-bff1-403f-ae17-b7e90500365d/volumes/kubernetes.io~projected/kube-api-access-5sdw4 major:0 minor:305 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d5f33153-bff1-403f-ae17-b7e90500365d/volumes/kubernetes.io~secret/profile-collector-cert:{mountpoint:/var/lib/kubelet/pods/d5f33153-bff1-403f-ae17-b7e90500365d/volumes/kubernetes.io~secret/profile-collector-cert major:0 minor:299 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d5f33153-bff1-403f-ae17-b7e90500365d/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/d5f33153-bff1-403f-ae17-b7e90500365d/volumes/kubernetes.io~secret/srv-cert major:0 minor:676 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d7171597-cb9a-451c-80a4-64cfccf885f0/volumes/kubernetes.io~empty-dir/etc-tuned:{mountpoint:/var/lib/kubelet/pods/d7171597-cb9a-451c-80a4-64cfccf885f0/volumes/kubernetes.io~empty-dir/etc-tuned major:0 minor:546 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/d7171597-cb9a-451c-80a4-64cfccf885f0/volumes/kubernetes.io~empty-dir/tmp:{mountpoint:/var/lib/kubelet/pods/d7171597-cb9a-451c-80a4-64cfccf885f0/volumes/kubernetes.io~empty-dir/tmp major:0 minor:545 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d7171597-cb9a-451c-80a4-64cfccf885f0/volumes/kubernetes.io~projected/kube-api-access-gs8fx:{mountpoint:/var/lib/kubelet/pods/d7171597-cb9a-451c-80a4-64cfccf885f0/volumes/kubernetes.io~projected/kube-api-access-gs8fx major:0 minor:547 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e/volumes/kubernetes.io~projected/kube-api-access-k7t26:{mountpoint:/var/lib/kubelet/pods/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e/volumes/kubernetes.io~projected/kube-api-access-k7t26 major:0 minor:990 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:988 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e/volumes/kubernetes.io~secret/webhook-cert major:0 minor:989 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/daa8efc0-4514-4a14-80f5-ab9eca53a127/volumes/kubernetes.io~projected/kube-api-access-rbsx8:{mountpoint:/var/lib/kubelet/pods/daa8efc0-4514-4a14-80f5-ab9eca53a127/volumes/kubernetes.io~projected/kube-api-access-rbsx8 major:0 minor:308 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/daa8efc0-4514-4a14-80f5-ab9eca53a127/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/daa8efc0-4514-4a14-80f5-ab9eca53a127/volumes/kubernetes.io~secret/serving-cert major:0 minor:291 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/volumes/kubernetes.io~projected/kube-api-access major:0 minor:322 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/volumes/kubernetes.io~secret/serving-cert major:0 minor:290 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f749c7f2-1fd7-4078-a92d-0ae5523998ac/volumes/kubernetes.io~projected/kube-api-access-lvklf:{mountpoint:/var/lib/kubelet/pods/f749c7f2-1fd7-4078-a92d-0ae5523998ac/volumes/kubernetes.io~projected/kube-api-access-lvklf major:0 minor:972 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f749c7f2-1fd7-4078-a92d-0ae5523998ac/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/f749c7f2-1fd7-4078-a92d-0ae5523998ac/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:971 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f96c70ce-314a-4919-91e9-cc776a620846/volumes/kubernetes.io~projected/kube-api-access-lkhn4:{mountpoint:/var/lib/kubelet/pods/f96c70ce-314a-4919-91e9-cc776a620846/volumes/kubernetes.io~projected/kube-api-access-lkhn4 major:0 minor:796 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f96c70ce-314a-4919-91e9-cc776a620846/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/f96c70ce-314a-4919-91e9-cc776a620846/volumes/kubernetes.io~secret/encryption-config major:0 minor:793 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f96c70ce-314a-4919-91e9-cc776a620846/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/f96c70ce-314a-4919-91e9-cc776a620846/volumes/kubernetes.io~secret/etcd-client major:0 minor:794 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f96c70ce-314a-4919-91e9-cc776a620846/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f96c70ce-314a-4919-91e9-cc776a620846/volumes/kubernetes.io~secret/serving-cert major:0 minor:795 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f9f99422-7991-40ef-92a1-de2e603e47b9/volumes/kubernetes.io~projected/kube-api-access-pk4z4:{mountpoint:/var/lib/kubelet/pods/f9f99422-7991-40ef-92a1-de2e603e47b9/volumes/kubernetes.io~projected/kube-api-access-pk4z4 major:0 minor:330 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f9f99422-7991-40ef-92a1-de2e603e47b9/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/f9f99422-7991-40ef-92a1-de2e603e47b9/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:297 fsType:tmpfs blockSize:0} overlay_0-1001:{mountpoint:/var/lib/containers/storage/overlay/63111602c8c4edcb888bb6f007e8b6bf20337175a58059baa7aa94a10eb5500b/merged major:0 minor:1001 fsType:overlay blockSize:0} overlay_0-1004:{mountpoint:/var/lib/containers/storage/overlay/5e43176a9dc33baebac4f2e18b7b33927eb6acb05890d8815ab936237df58309/merged major:0 minor:1004 fsType:overlay blockSize:0} overlay_0-102:{mountpoint:/var/lib/containers/storage/overlay/01e37568a24b7c43181d1c8e69b2af6c0939ace20defe1ff7efba28e5a329541/merged major:0 minor:102 fsType:overlay blockSize:0} overlay_0-1021:{mountpoint:/var/lib/containers/storage/overlay/d265a00c872cac3e8dd9aa6157409171b4bff5f9d73401f93ec335e5768e2794/merged major:0 minor:1021 fsType:overlay blockSize:0} overlay_0-1023:{mountpoint:/var/lib/containers/storage/overlay/65f99402fbdb27a2afa2507e6a0800f6dd56d2b0f5cadcfefd7151fb0d3b1f58/merged major:0 minor:1023 fsType:overlay blockSize:0} overlay_0-1025:{mountpoint:/var/lib/containers/storage/overlay/2c0160bf228f12395c20b3d68195db19a2f205b420c908eac4cd898c533cbd63/merged major:0 minor:1025 fsType:overlay blockSize:0} 
overlay_0-1030:{mountpoint:/var/lib/containers/storage/overlay/822a59d2b905a65e39321c3bdfa3fdbc60c5cc5c709d3de5c88ad434257f262e/merged major:0 minor:1030 fsType:overlay blockSize:0} overlay_0-1038:{mountpoint:/var/lib/containers/storage/overlay/f3a5c9cd99c5e0c9b1ee8ce373ac8d367b3c4217b8e3c17930862e05fa21d59c/merged major:0 minor:1038 fsType:overlay blockSize:0} overlay_0-1040:{mountpoint:/var/lib/containers/storage/overlay/ad3d75ac9fce5497f7739291dcec17bb108cf0a1b6dfe882b28bd0bfc7480580/merged major:0 minor:1040 fsType:overlay blockSize:0} overlay_0-1043:{mountpoint:/var/lib/containers/storage/overlay/f801261085fe034f7bc769f7a8e9c84c2fb3cdf652ef738cd45874d029a119fb/merged major:0 minor:1043 fsType:overlay blockSize:0} overlay_0-1047:{mountpoint:/var/lib/containers/storage/overlay/cca5e5799c5ba0d7dcbd6f6ca52001bbb4761e778f5ac0e7abaaf49675db69ac/merged major:0 minor:1047 fsType:overlay blockSize:0} overlay_0-1049:{mountpoint:/var/lib/containers/storage/overlay/935a2e9688b6b04dc7c45d769d0a6a49cad2f7213a9b58ab613de7a94d1d4909/merged major:0 minor:1049 fsType:overlay blockSize:0} overlay_0-1054:{mountpoint:/var/lib/containers/storage/overlay/1b841a49a8e132cffa60596df98ad48a955a40d36b886e0b1c6135d81a76c40a/merged major:0 minor:1054 fsType:overlay blockSize:0} overlay_0-106:{mountpoint:/var/lib/containers/storage/overlay/56a4440a527d425a7796159c9fcce0d2771fed4d65432e0e5b73390d568436de/merged major:0 minor:106 fsType:overlay blockSize:0} overlay_0-1064:{mountpoint:/var/lib/containers/storage/overlay/ee7da236f598d958dcaf8fe1de038b0297422ccbf3b17d9f0c5d969e0814c681/merged major:0 minor:1064 fsType:overlay blockSize:0} overlay_0-1065:{mountpoint:/var/lib/containers/storage/overlay/8c235047b028ee116a338e7826b1bf220c3920b5833d7ac8c25f10bd6f0e88b8/merged major:0 minor:1065 fsType:overlay blockSize:0} overlay_0-1068:{mountpoint:/var/lib/containers/storage/overlay/f08caa057413b8496478bc251ee32da095a0e174ef7960818c43d851f5b90f48/merged major:0 minor:1068 fsType:overlay blockSize:0} 
overlay_0-1080:{mountpoint:/var/lib/containers/storage/overlay/0c59462b7586fd9fe5deb317d40861d67099d818dfdc7689a263270db795cbf0/merged major:0 minor:1080 fsType:overlay blockSize:0} overlay_0-1082:{mountpoint:/var/lib/containers/storage/overlay/4a61f3d8dbd1922cb7762cdf4e47b78e4ba5bcc1f243ea0689041ec29a9dc037/merged major:0 minor:1082 fsType:overlay blockSize:0} overlay_0-1088:{mountpoint:/var/lib/containers/storage/overlay/78d802cf9a965f5fde34899b8047b299243c14fe22c31d8641979a0dd2496e3a/merged major:0 minor:1088 fsType:overlay blockSize:0} overlay_0-109:{mountpoint:/var/lib/containers/storage/overlay/1b85ec28132cdef392de4e83d14a49f85dc21528edaa8afdf81c54b74b593fe0/merged major:0 minor:109 fsType:overlay blockSize:0} overlay_0-1101:{mountpoint:/var/lib/containers/storage/overlay/9f2d91429b99be236006201b622905478c15a21ff72e5d3a7ee7ae4af01dcc25/merged major:0 minor:1101 fsType:overlay blockSize:0} overlay_0-1103:{mountpoint:/var/lib/containers/storage/overlay/50dd37474447b9fd1460ca4850b113c0e60633ab5ac12415ae453e30af2550e3/merged major:0 minor:1103 fsType:overlay blockSize:0} overlay_0-1109:{mountpoint:/var/lib/containers/storage/overlay/36deccb4f83bd94368b82abaabec10e6cf9089cd549b718db2008f607bd15897/merged major:0 minor:1109 fsType:overlay blockSize:0} overlay_0-111:{mountpoint:/var/lib/containers/storage/overlay/cbe1673e539bd587895d44729489ab994cd9e15c7173c9f0d6b2fb1336b615ea/merged major:0 minor:111 fsType:overlay blockSize:0} overlay_0-1111:{mountpoint:/var/lib/containers/storage/overlay/512e24da13b4a917485ff1322a937eb9b6d584314b6aa56def8bf449931e2a5c/merged major:0 minor:1111 fsType:overlay blockSize:0} overlay_0-112:{mountpoint:/var/lib/containers/storage/overlay/e57ec1511f008d61ad921b8a7f676c691015f0d35f7ef2274d7d8aa68f42e3fd/merged major:0 minor:112 fsType:overlay blockSize:0} overlay_0-114:{mountpoint:/var/lib/containers/storage/overlay/3d8561b911268da66974db44b73bcb362c8c72fc62ec472c7efaeb638dd8905c/merged major:0 minor:114 fsType:overlay blockSize:0} 
overlay_0-1154:{mountpoint:/var/lib/containers/storage/overlay/1ee7d18d6bf6d989111505b433cbcdc2734dd93380c6d33d352b1118d4a1d54a/merged major:0 minor:1154 fsType:overlay blockSize:0} overlay_0-1173:{mountpoint:/var/lib/containers/storage/overlay/b5d5a0c11c945c8a2c88bdb87facf4df31edd23caa98a9a35c51bda549024213/merged major:0 minor:1173 fsType:overlay blockSize:0} overlay_0-1175:{mountpoint:/var/lib/containers/storage/overlay/7666eb4112d9c0e8ea66648b7bf32739365bc327cb559a1e163fcca8a1436657/merged major:0 minor:1175 fsType:overlay blockSize:0} overlay_0-1176:{mountpoint:/var/lib/containers/storage/overlay/1687efe965cf3c6a067aae687404a39630657712681e0008ffea1cbb084c472c/merged major:0 minor:1176 fsType:overlay blockSize:0} overlay_0-120:{mountpoint:/var/lib/containers/storage/overlay/fce577da20e0aab384b42a260ff6bdabc14d3f3500a236d2768fea074ab3f874/merged major:0 minor:120 fsType:overlay blockSize:0} overlay_0-1202:{mountpoint:/var/lib/containers/storage/overlay/9cff85f9f5186ecc0de5a2133ed4682777c75b0e42cce84776c323664433f8bd/merged major:0 minor:1202 fsType:overlay blockSize:0} overlay_0-1214:{mountpoint:/var/lib/containers/storage/overlay/2d2d2b4e7d0548d0daead3c417b5e840c0787569acb864cb59a781d7541a27aa/merged major:0 minor:1214 fsType:overlay blockSize:0} overlay_0-1224:{mountpoint:/var/lib/containers/storage/overlay/e12fe082841f37fa5832154e04911b08f177058df80d0f654d316271e165ffab/merged major:0 minor:1224 fsType:overlay blockSize:0} overlay_0-1226:{mountpoint:/var/lib/containers/storage/overlay/0459b3bcd9228c81f418c41fc585d0ebe58474d21563a4a1a1487fb5c3281b6a/merged major:0 minor:1226 fsType:overlay blockSize:0} overlay_0-124:{mountpoint:/var/lib/containers/storage/overlay/7553b136442b262bdba3952c9950e5fb24caa30dcae9fe6cc01c85b7e958db26/merged major:0 minor:124 fsType:overlay blockSize:0} overlay_0-1242:{mountpoint:/var/lib/containers/storage/overlay/d7a60abb5c829d116be45abca84b7295b18a0d329b8c590cd1dfb9d5972fafe8/merged major:0 minor:1242 fsType:overlay blockSize:0} 
overlay_0-1252:{mountpoint:/var/lib/containers/storage/overlay/dac6390b426e15dabb208b86d5121ec7ed8fb0980517bec44d0c4e5b2267db6a/merged major:0 minor:1252 fsType:overlay blockSize:0} overlay_0-1263:{mountpoint:/var/lib/containers/storage/overlay/878df61d64dad21525efdf79fe02f90916f6b542e84973e7046edc834ad9b90a/merged major:0 minor:1263 fsType:overlay blockSize:0} overlay_0-1265:{mountpoint:/var/lib/containers/storage/overlay/8d7a0c97048978b33bb5bea7dca8bf555e3be6757f2e86236b32c32e637afc50/merged major:0 minor:1265 fsType:overlay blockSize:0} overlay_0-1268:{mountpoint:/var/lib/containers/storage/overlay/b5b330943dfecba1e95eb41ced51a1c62b3334cb7f0bdc576db5c7e6fc4afca9/merged major:0 minor:1268 fsType:overlay blockSize:0} overlay_0-127:{mountpoint:/var/lib/containers/storage/overlay/0bd69db7ee13ba17fdb90b4e845e0b08823660e03858a2fefa9c745e7a3260b8/merged major:0 minor:127 fsType:overlay blockSize:0} overlay_0-1277:{mountpoint:/var/lib/containers/storage/overlay/59def15987d098501c661954b298eda038fde0e697d512623855c9f12fe2f993/merged major:0 minor:1277 fsType:overlay blockSize:0} overlay_0-1297:{mountpoint:/var/lib/containers/storage/overlay/e4dbb2e24fc33405b5e2af14f77d780e66fcb7c1890a86328258c565255fb882/merged major:0 minor:1297 fsType:overlay blockSize:0} overlay_0-130:{mountpoint:/var/lib/containers/storage/overlay/33be16944a7983ef6877fa6ecb1741f0131ca67e6621f25eed85c8c0e002a597/merged major:0 minor:130 fsType:overlay blockSize:0} overlay_0-131:{mountpoint:/var/lib/containers/storage/overlay/1162b9d5b06f470702e57ad9978e7ed92220a1d2cdf231f28dc981aa42a42acb/merged major:0 minor:131 fsType:overlay blockSize:0} overlay_0-1316:{mountpoint:/var/lib/containers/storage/overlay/3cada3899159da43c6869d9d180b33ededc9d886cce34814f7edb1184efacbf0/merged major:0 minor:1316 fsType:overlay blockSize:0} overlay_0-1328:{mountpoint:/var/lib/containers/storage/overlay/e2c6c27d49c1b6dca6d316598170d5689d2e9ec1a9b9667ceb5c0c5837058f2f/merged major:0 minor:1328 fsType:overlay blockSize:0} 
overlay_0-133:{mountpoint:/var/lib/containers/storage/overlay/26150a8e6e01d5fd60c9f398dff5b9d156e2656347cf26addc9ff8e67765e6a4/merged major:0 minor:133 fsType:overlay blockSize:0} overlay_0-1335:{mountpoint:/var/lib/containers/storage/overlay/bc1f8990dbffdf04f82cecf6676dc1657ab0bab5d1279447dc8723457330bb72/merged major:0 minor:1335 fsType:overlay blockSize:0} overlay_0-1343:{mountpoint:/var/lib/containers/storage/overlay/63c7c5031848975a542840f92f16e7079db8edbd5534022bf3dc0a01bd5aab4d/merged major:0 minor:1343 fsType:overlay blockSize:0} overlay_0-1348:{mountpoint:/var/lib/containers/storage/overlay/7612ad6ce5a589e42a80ef67ca8627528ad343c3f9b8cc5be93d4416191b5a5a/merged major:0 minor:1348 fsType:overlay blockSize:0} overlay_0-1353:{mountpoint:/var/lib/containers/storage/overlay/6e6bcbce25b3f35bab3d08f377e80f595999f8a9e08befebe9a8313dff6ea044/merged major:0 minor:1353 fsType:overlay blockSize:0} overlay_0-1358:{mountpoint:/var/lib/containers/storage/overlay/d364f50402113c8f60a461e5ca135e7a8a54e5be12953a4135a3405ae60dfa03/merged major:0 minor:1358 fsType:overlay blockSize:0} overlay_0-1367:{mountpoint:/var/lib/containers/storage/overlay/f238d44e1bfe45eca47a9c85bc8c98be9ed310dae27ff3f4ccb055c86763a60d/merged major:0 minor:1367 fsType:overlay blockSize:0} overlay_0-1369:{mountpoint:/var/lib/containers/storage/overlay/2c9a655a87c029af27758eaa3716e989b086b5c3af8c6d8119d442303e505a09/merged major:0 minor:1369 fsType:overlay blockSize:0} overlay_0-1378:{mountpoint:/var/lib/containers/storage/overlay/d90b705a26b451ffc75d56555522e84a22a55f05ad600f52dbd0794233e464d1/merged major:0 minor:1378 fsType:overlay blockSize:0} overlay_0-138:{mountpoint:/var/lib/containers/storage/overlay/1004c29d4dc2e5ae9f71f2b283a0161528285c854807a1dbdf6ef884115bb780/merged major:0 minor:138 fsType:overlay blockSize:0} overlay_0-1388:{mountpoint:/var/lib/containers/storage/overlay/8a521a998b95d346a0dc7368a252344ec64071a5466688206c2b2357fe6871a9/merged major:0 minor:1388 fsType:overlay blockSize:0} 
overlay_0-1391:{mountpoint:/var/lib/containers/storage/overlay/c48e5b60449837404c77134974e194e080fed1eff4b759c91dd076e6b87392f5/merged major:0 minor:1391 fsType:overlay blockSize:0} overlay_0-1394:{mountpoint:/var/lib/containers/storage/overlay/032451e284bda0f8975df398c02f9169c48b10214edd0a3dd331aa786c1b80e1/merged major:0 minor:1394 fsType:overlay blockSize:0} overlay_0-1400:{mountpoint:/var/lib/containers/storage/overlay/4200df4db72bc843bac40b577683eba9d746245c6989bb2018560ac447feb290/merged major:0 minor:1400 fsType:overlay blockSize:0} overlay_0-1409:{mountpoint:/var/lib/containers/storage/overlay/fb65903d4087ccb61c707882a5769afca2cf868be083eac4cfc7e9c039c362b8/merged major:0 minor:1409 fsType:overlay blockSize:0} overlay_0-143:{mountpoint:/var/lib/containers/storage/overlay/c27a20388be74924149d9e99531bb8b65e6a712271ec59b3be04a8998d59dedf/merged major:0 minor:143 fsType:overlay blockSize:0} overlay_0-147:{mountpoint:/var/lib/containers/storage/overlay/f75f2289eb91a86ce692ca0d840dbbe974ddb5b8c357cd7b938271cbfa785b82/merged major:0 minor:147 fsType:overlay blockSize:0} overlay_0-149:{mountpoint:/var/lib/containers/storage/overlay/bc4f71e8b8d82e0ad27bd2f134fd5975227dcc209d49bd614753fdc0e79baf93/merged major:0 minor:149 fsType:overlay blockSize:0} overlay_0-151:{mountpoint:/var/lib/containers/storage/overlay/c071acf2c42a58f5fd09cb85c464730b75ab6dff05815d5752fc0606b1c0ccc0/merged major:0 minor:151 fsType:overlay blockSize:0} overlay_0-153:{mountpoint:/var/lib/containers/storage/overlay/80c1684eb890fb1daa732afcbbd75db2b619a73b440b9d8c2bccc078dc877a36/merged major:0 minor:153 fsType:overlay blockSize:0} overlay_0-155:{mountpoint:/var/lib/containers/storage/overlay/5753da5558b9f8596dab87b5dbf7d299de3c68169d41040c325e68f583623825/merged major:0 minor:155 fsType:overlay blockSize:0} overlay_0-157:{mountpoint:/var/lib/containers/storage/overlay/2cbd7bbce53634c485921b252dc3061642d0855d651b865add08036c22585321/merged major:0 minor:157 fsType:overlay blockSize:0} 
overlay_0-165:{mountpoint:/var/lib/containers/storage/overlay/e2a2a7fd2682f648d30a42254b47101c842bf68b8e2a60186ad746707892f76a/merged major:0 minor:165 fsType:overlay blockSize:0} overlay_0-167:{mountpoint:/var/lib/containers/storage/overlay/75e8e9d8e8519bd4cf96fd700f21930d4d0bd5b668b5431b8337458011a7d74d/merged major:0 minor:167 fsType:overlay blockSize:0} overlay_0-173:{mountpoint:/var/lib/containers/storage/overlay/42ba82f93953383102c71431ae42faf4e61b9734194f958c695db0e7cf201504/merged major:0 minor:173 fsType:overlay blockSize:0} overlay_0-181:{mountpoint:/var/lib/containers/storage/overlay/5c3360d970e6e048298c4d9620de740a87fb1a67a55381dd871c6a9829c8b9b9/merged major:0 minor:181 fsType:overlay blockSize:0} overlay_0-183:{mountpoint:/var/lib/containers/storage/overlay/a565e58719d5970fc9bfb6b872797b60ec22a84ef0b2761fd6fa3c88d1962e13/merged major:0 minor:183 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/75ddef7480f281ec07ceca3ce28f6c684a389b0b57e289673849e30c4b0081b6/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-191:{mountpoint:/var/lib/containers/storage/overlay/749a2a069dd05cd62cd3d08ecace9255c2b9f5f83cdaf0801c5159793285337d/merged major:0 minor:191 fsType:overlay blockSize:0} overlay_0-193:{mountpoint:/var/lib/containers/storage/overlay/062819f0492cedc7321aae8a6f04970c6d37225b8e9e71e76c4703e921cdc989/merged major:0 minor:193 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/154db8920aed66f693824269968b5900519a3200f2964785bebbd77a8f93591a/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-197:{mountpoint:/var/lib/containers/storage/overlay/658fa1657c4947ab3420052e1438ec6755e21ef571576b9538fe1278ca214b9c/merged major:0 minor:197 fsType:overlay blockSize:0} overlay_0-203:{mountpoint:/var/lib/containers/storage/overlay/60843bf32c8aa133f0a2c32e13959689f512f8856189f09a47b09557de2e9480/merged major:0 minor:203 fsType:overlay blockSize:0} 
overlay_0-211:{mountpoint:/var/lib/containers/storage/overlay/cd0e0c415170062505067cde2da0b5e6ec7259656863f73159217b9563cef5a3/merged major:0 minor:211 fsType:overlay blockSize:0} overlay_0-213:{mountpoint:/var/lib/containers/storage/overlay/c32c9e29ea7b415b5875cc854cecf3e7a5b29c7b52a830d1035bdf00878f2553/merged major:0 minor:213 fsType:overlay blockSize:0} overlay_0-215:{mountpoint:/var/lib/containers/storage/overlay/776841f50d586cda6eb39e144487c3c0b98792ce5a700475b7a9dab55c779a3d/merged major:0 minor:215 fsType:overlay blockSize:0} overlay_0-225:{mountpoint:/var/lib/containers/storage/overlay/92d90c3e4a060698dd05ebe53f21f82c4edacf6689865a8436eb80b0c2482993/merged major:0 minor:225 fsType:overlay blockSize:0} overlay_0-226:{mountpoint:/var/lib/containers/storage/overlay/9febea4b01b238ed11f90be195d61cfa2c2d3d684569bbcee9a4b95a9589e32a/merged major:0 minor:226 fsType:overlay blockSize:0} overlay_0-234:{mountpoint:/var/lib/containers/storage/overlay/826fb9274729a04ac8bf1f2dc7809bbfdcd1ef2d903ce8925e12a96c0d54ccdc/merged major:0 minor:234 fsType:overlay blockSize:0} overlay_0-242:{mountpoint:/var/lib/containers/storage/overlay/3ea75403be9ff49f0d89e395fb71c8afdcf31c2dca4bf3be27c0bfc9c2b02b1f/merged major:0 minor:242 fsType:overlay blockSize:0} overlay_0-250:{mountpoint:/var/lib/containers/storage/overlay/542f68d90ba74a108bed62852741fecae67d09e6d48c00aab4641f44fc4d9b27/merged major:0 minor:250 fsType:overlay blockSize:0} overlay_0-258:{mountpoint:/var/lib/containers/storage/overlay/577a4d9cb019e724a8e12ef1daee525d6fa1600006aef5edbb01e7dcd910283b/merged major:0 minor:258 fsType:overlay blockSize:0} overlay_0-266:{mountpoint:/var/lib/containers/storage/overlay/9507e651f29e86bf69ac5f0fa8947bc090d60c10765e92eaecfe6ed356113cd0/merged major:0 minor:266 fsType:overlay blockSize:0} overlay_0-271:{mountpoint:/var/lib/containers/storage/overlay/c560ad2fe306a287e3b7e2af00ede38dc48fd77694968e2249e1766389296e9d/merged major:0 minor:271 fsType:overlay blockSize:0} 
overlay_0-344:{mountpoint:/var/lib/containers/storage/overlay/0d2e73df435b5c0a11afee4c921f2059332b9ae28d1942e279e4876055cb3c49/merged major:0 minor:344 fsType:overlay blockSize:0} overlay_0-350:{mountpoint:/var/lib/containers/storage/overlay/d7cf977cc989de992a217f0075826286da5e96a4a849f319fedd84ce7fe63bf2/merged major:0 minor:350 fsType:overlay blockSize:0} overlay_0-352:{mountpoint:/var/lib/containers/storage/overlay/0b6b91b67dfb1ed892203184f5fcf8a0515e22da4351249925d088a82543de78/merged major:0 minor:352 fsType:overlay blockSize:0} overlay_0-354:{mountpoint:/var/lib/containers/storage/overlay/a4cf883831a1b1ad73ed376a48b26380e5d77283245da5c25e12fd1684801427/merged major:0 minor:354 fsType:overlay blockSize:0} overlay_0-356:{mountpoint:/var/lib/containers/storage/overlay/357df791a42b82a3f0a028f1fa70e430ff1f4762ddb53dc6da86b2c969315824/merged major:0 minor:356 fsType:overlay blockSize:0} overlay_0-360:{mountpoint:/var/lib/containers/storage/overlay/7ddddda0c20864a2ba21f7bd6bdd3684a476ddec7c8904bba715aedfb231f085/merged major:0 minor:360 fsType:overlay blockSize:0} overlay_0-362:{mountpoint:/var/lib/containers/storage/overlay/63e4b31005f005596c90661bcd11ea6b0d7f6763d3f48f6674ba2f76d9631872/merged major:0 minor:362 fsType:overlay blockSize:0} overlay_0-364:{mountpoint:/var/lib/containers/storage/overlay/d92751fea002c7312927f4c988ef66b0e5f5e32a423ef53eae60aa655aff8b4f/merged major:0 minor:364 fsType:overlay blockSize:0} overlay_0-372:{mountpoint:/var/lib/containers/storage/overlay/910b87c15ee1b84ed4b7d1541fbcae1f0a9aa5263509a81babb54c425ac0b23d/merged major:0 minor:372 fsType:overlay blockSize:0} overlay_0-373:{mountpoint:/var/lib/containers/storage/overlay/714b90d43dafb90c0316a8aa5770ffb0680b71be5ea6c56fa1bdbbe88e555ed6/merged major:0 minor:373 fsType:overlay blockSize:0} overlay_0-375:{mountpoint:/var/lib/containers/storage/overlay/cb02664550bde6617b9760e31c58cab812be593725e7360e72a4a1b06c7d0aeb/merged major:0 minor:375 fsType:overlay blockSize:0} 
overlay_0-377:{mountpoint:/var/lib/containers/storage/overlay/739bca9868adbe0ae08dc4d285f26c54badb979e34da856ec7aacdd9e8939bdd/merged major:0 minor:377 fsType:overlay blockSize:0} overlay_0-379:{mountpoint:/var/lib/containers/storage/overlay/742873d59edab48b4d0150a0ca9e690e84ffd20e86825d4e39fd5bcb9ebc0795/merged major:0 minor:379 fsType:overlay blockSize:0} overlay_0-384:{mountpoint:/var/lib/containers/storage/overlay/2d66b49203bddda94e368a202d17e06d683060a7ebb9ae7ecaf7e896b9519926/merged major:0 minor:384 fsType:overlay blockSize:0} overlay_0-389:{mountpoint:/var/lib/containers/storage/overlay/179a8de0504b96d2752b8ba627058b94dc8278fcb9fb719b11253669c5697a34/merged major:0 minor:389 fsType:overlay blockSize:0} overlay_0-391:{mountpoint:/var/lib/containers/storage/overlay/5fcfca9079ea12684bfa9058df007d8a08f27b18f97257ee161629dabeaf4d8c/merged major:0 minor:391 fsType:overlay blockSize:0} overlay_0-395:{mountpoint:/var/lib/containers/storage/overlay/1a9094f111e5eeaaeabbfc2d923f4fee973bb6f2c6c334dd266236254251b9f6/merged major:0 minor:395 fsType:overlay blockSize:0} overlay_0-397:{mountpoint:/var/lib/containers/storage/overlay/6e231efd86645ccdfabd834613a2bd1d9115a630f35f9ae0be27cc8cdd41f2e2/merged major:0 minor:397 fsType:overlay blockSize:0} overlay_0-401:{mountpoint:/var/lib/containers/storage/overlay/6b60f943d6a97efc8396b449b0e4d72c0b6ec49613766651fdbb3985eb916878/merged major:0 minor:401 fsType:overlay blockSize:0} overlay_0-403:{mountpoint:/var/lib/containers/storage/overlay/2a24011b0d4b7e16b033ce4c137352b6949d03a9bcaa817eb0c174722f33f7d9/merged major:0 minor:403 fsType:overlay blockSize:0} overlay_0-405:{mountpoint:/var/lib/containers/storage/overlay/5787ab1d83218ba7a52a79eb7e357f2b6cbfc3295713be815838e380f425101e/merged major:0 minor:405 fsType:overlay blockSize:0} overlay_0-407:{mountpoint:/var/lib/containers/storage/overlay/0a1a2c379b383ef069b60697c39d6bb7d027046834e8c1f38cef4a4eca8cf7ef/merged major:0 minor:407 fsType:overlay blockSize:0} 
overlay_0-409:{mountpoint:/var/lib/containers/storage/overlay/df84e1149e0c5f0cf23a821ed142923d4c8d8220299200650c098b90790fcb10/merged major:0 minor:409 fsType:overlay blockSize:0} overlay_0-41:{mountpoint:/var/lib/containers/storage/overlay/c48d3c74c1d3ffa6bb26ca709712ad4594bc58102cb0c90a7c8fc02345157862/merged major:0 minor:41 fsType:overlay blockSize:0} overlay_0-411:{mountpoint:/var/lib/containers/storage/overlay/a0ac6d0acd420fb457c8def6a16b56761986ac41206171c86cf2b7702536f82b/merged major:0 minor:411 fsType:overlay blockSize:0} overlay_0-413:{mountpoint:/var/lib/containers/storage/overlay/2a903ec6b6393a8996683c3aef9b856523c61c2cadc21b5305b07d186d99c0fd/merged major:0 minor:413 fsType:overlay blockSize:0} overlay_0-420:{mountpoint:/var/lib/containers/storage/overlay/477cf05012a86f639dab9e51e3815d6851bfd0e86bd861a8a84fc2288baed2f1/merged major:0 minor:420 fsType:overlay blockSize:0} overlay_0-423:{mountpoint:/var/lib/containers/storage/overlay/68e762344dcb1a3de566e4f070c48af825afc479e1c7d52abe903375386d6152/merged major:0 minor:423 fsType:overlay blockSize:0} overlay_0-425:{mountpoint:/var/lib/containers/storage/overlay/211fa5539d3ea0c158d2762c20508a21e97d08f682369e5fc3f2c59d51500c6a/merged major:0 minor:425 fsType:overlay blockSize:0} overlay_0-427:{mountpoint:/var/lib/containers/storage/overlay/e146c774a4c5b7c0d07cebc46ea50fbd16552b808c5d049e6d88477a8f1a642e/merged major:0 minor:427 fsType:overlay blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/6db36944aa97848b1163a8f35f5a88f701dc80873a1f4bc05f32d52978679626/merged major:0 minor:43 fsType:overlay blockSize:0} overlay_0-431:{mountpoint:/var/lib/containers/storage/overlay/241b6aef283f5f15fecaefd51c25d690b2ab9f0b7afffbca939f294fccd5e3b4/merged major:0 minor:431 fsType:overlay blockSize:0} overlay_0-437:{mountpoint:/var/lib/containers/storage/overlay/fbe5f947af2adae6e4e315718c739288dc5f9b3fcff773f04f51d8f850be0bb7/merged major:0 minor:437 fsType:overlay blockSize:0} 
overlay_0-440:{mountpoint:/var/lib/containers/storage/overlay/2365ffaa2ad0a8f85b1280893d9981731599e6069b9939c96454d28dd7e072d0/merged major:0 minor:440 fsType:overlay blockSize:0} overlay_0-451:{mountpoint:/var/lib/containers/storage/overlay/d48b920cf31e92b14105876adb660adef6847664f434262819044c910cb61703/merged major:0 minor:451 fsType:overlay blockSize:0} overlay_0-453:{mountpoint:/var/lib/containers/storage/overlay/5aa995edf265e5dd3e5f037ace7f3dc11f93b3105c022f53e8909076648c68cb/merged major:0 minor:453 fsType:overlay blockSize:0} overlay_0-464:{mountpoint:/var/lib/containers/storage/overlay/471ec69992caa0b1b845d41437c15f44775969c412a211eb391306aa80616e62/merged major:0 minor:464 fsType:overlay blockSize:0} overlay_0-468:{mountpoint:/var/lib/containers/storage/overlay/13f4192c247153170e008fe0df0aed5341b10979216093db67406c3e2938f2a3/merged major:0 minor:468 fsType:overlay blockSize:0} overlay_0-469:{mountpoint:/var/lib/containers/storage/overlay/fcadd55d474b4729398f24c0eeab4955a838c8d35c28185a03e92c0c90e2ba45/merged major:0 minor:469 fsType:overlay blockSize:0} overlay_0-471:{mountpoint:/var/lib/containers/storage/overlay/dc93e6acdbc03b74f7e6df37125e0ef88c992d5e346855c6b5313ba2a1999092/merged major:0 minor:471 fsType:overlay blockSize:0} overlay_0-478:{mountpoint:/var/lib/containers/storage/overlay/44d4c16a44479fadbb6b64debcd9de88d9d2b9bd32e706674591b238bce78a90/merged major:0 minor:478 fsType:overlay blockSize:0} overlay_0-481:{mountpoint:/var/lib/containers/storage/overlay/2dd6298b1f72ec8bb4bd5f5f0e5b4e4e6e4ecf73006a75b0db00abf999b86496/merged major:0 minor:481 fsType:overlay blockSize:0} overlay_0-493:{mountpoint:/var/lib/containers/storage/overlay/bca6f8ad83aa8af83dc4e443d7ebeefff8f190eab6e997980b50606d2374f457/merged major:0 minor:493 fsType:overlay blockSize:0} overlay_0-496:{mountpoint:/var/lib/containers/storage/overlay/43d54ba953cbc66470de20d8b88bf3ed0b0a5fe3ce8ed08ce5c5770c126db79d/merged major:0 minor:496 fsType:overlay blockSize:0} 
overlay_0-498:{mountpoint:/var/lib/containers/storage/overlay/1e47e54300d4aa4d560d8f56fb97e75af46b4b21623a06fc234f48b2cda648e2/merged major:0 minor:498 fsType:overlay blockSize:0} overlay_0-50:{mountpoint:/var/lib/containers/storage/overlay/a605689b5a5f7b6696e18721c6545ed4b0547d0e46c1da85e24e88c6cfd06ba6/merged major:0 minor:50 fsType:overlay blockSize:0} overlay_0-503:{mountpoint:/var/lib/containers/storage/overlay/5ca6ab21bbec5c16722d0ed6cf161ebcf560bad259214c7c0a856458658c21a0/merged major:0 minor:503 fsType:overlay blockSize:0} overlay_0-507:{mountpoint:/var/lib/containers/storage/overlay/53c4c9c63be7832785de56706f327f98b36841ff5283ae2bd2a416e1ef057f78/merged major:0 minor:507 fsType:overlay blockSize:0} overlay_0-512:{mountpoint:/var/lib/containers/storage/overlay/ea74bf2e540c7db64605bd112dfdd1d9d6efc322d521ff6cf6b9ce70bcec5151/merged major:0 minor:512 fsType:overlay blockSize:0} overlay_0-518:{mountpoint:/var/lib/containers/storage/overlay/5d24eb99465f74f0c8f2ad0d7ff4f8748bfa4dd50b399ef6f5ffdb4f856fd950/merged major:0 minor:518 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/1545058f2b5721bf1876659dd8a9ecb73d205299e610cb2e4ed4254c48cce3ae/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-523:{mountpoint:/var/lib/containers/storage/overlay/b2dea036016d571e7d3cea598b5eedf463e6592623c5b3db54cf3be95f56968b/merged major:0 minor:523 fsType:overlay blockSize:0} overlay_0-525:{mountpoint:/var/lib/containers/storage/overlay/67cc7b45d5bfdc775f85232683103b359ce1ecda8d7db0f9a19248a87aaee11c/merged major:0 minor:525 fsType:overlay blockSize:0} overlay_0-537:{mountpoint:/var/lib/containers/storage/overlay/601f36117b6906c6ef03117fcf2298765c41616824535e35b1e81be899bb670c/merged major:0 minor:537 fsType:overlay blockSize:0} overlay_0-548:{mountpoint:/var/lib/containers/storage/overlay/6bbb392e64fee0a87471cb18ffd731ad0bd3285aecd4e9c983a1bfe6faad9ff2/merged major:0 minor:548 fsType:overlay blockSize:0} 
overlay_0-554:{mountpoint:/var/lib/containers/storage/overlay/d693d53b88c88bca0585c1cace3a6f86c58fccd5714816c974ecaffdd36c1b59/merged major:0 minor:554 fsType:overlay blockSize:0} overlay_0-555:{mountpoint:/var/lib/containers/storage/overlay/48d11bac718bfcb479a0742befb9f8c2711e8bf7b8d94b1ea11af67e7050523f/merged major:0 minor:555 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/5d40379c82f8418c190a1b69f478e18dc68767b61b478bac80b096ee263ae577/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-563:{mountpoint:/var/lib/containers/storage/overlay/ade80f54000d03fe270509347efb29f690680aa5408a127daca1342ffc63ad44/merged major:0 minor:563 fsType:overlay blockSize:0} overlay_0-564:{mountpoint:/var/lib/containers/storage/overlay/8700074037c640b479cf96441924af64903d3ff20c2e07acce36cd28c574ddf5/merged major:0 minor:564 fsType:overlay blockSize:0} overlay_0-582:{mountpoint:/var/lib/containers/storage/overlay/cc8cdcc788efe165b75d0e763388639f5195258884ad93321fb072ce3c780d53/merged major:0 minor:582 fsType:overlay blockSize:0} overlay_0-584:{mountpoint:/var/lib/containers/storage/overlay/6c703dd86c8c7f21945b09c52e427e778c9615dc21eaf4289eaa655dabc8a31e/merged major:0 minor:584 fsType:overlay blockSize:0} overlay_0-592:{mountpoint:/var/lib/containers/storage/overlay/083864c283b37ca2d49e2c95e62d362fac836ae08c0638d6b4dc61ce58f6cbfc/merged major:0 minor:592 fsType:overlay blockSize:0} overlay_0-596:{mountpoint:/var/lib/containers/storage/overlay/95c245c3334a2351f9ed66726f4cee3ea019e8c2b13170cfb79877c77da54141/merged major:0 minor:596 fsType:overlay blockSize:0} overlay_0-598:{mountpoint:/var/lib/containers/storage/overlay/78861697d77fa753d2ac843a32c871b3807a735b814822ff4684d82a7ee87226/merged major:0 minor:598 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/08963768f75114cd9191d22721f1a1a86b585929e3b1a7b5747ff378a14dde17/merged major:0 minor:60 fsType:overlay blockSize:0} 
overlay_0-602:{mountpoint:/var/lib/containers/storage/overlay/104a04eb04a80a2a342e2d0367856bf87dd79d3823a30713330518e8f99697ef/merged major:0 minor:602 fsType:overlay blockSize:0} overlay_0-606:{mountpoint:/var/lib/containers/storage/overlay/14af3002e7845ba87690618b549310197c2c85fb8f7336dcaa5080ea363b235e/merged major:0 minor:606 fsType:overlay blockSize:0} overlay_0-611:{mountpoint:/var/lib/containers/storage/overlay/ad8b814f70e5186a60f6ecf13b29a087e92d614b94dc27dcd7ff91e97331e029/merged major:0 minor:611 fsType:overlay blockSize:0} overlay_0-613:{mountpoint:/var/lib/containers/storage/overlay/b502f79e4d8c2464c4b05d50ef39bf6b51a8429ac79ee68516e12a172d7fce78/merged major:0 minor:613 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/9aefb9f47e3c8709a30a7a83982b435dcafddb2345f1a76614c1108a44505ff4/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-624:{mountpoint:/var/lib/containers/storage/overlay/eb02a6076ef7fadd8259f714c486b7fc034c6edeec3003af8404d6b30e8b73e4/merged major:0 minor:624 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/4fbeebe590b2014ecaa45727b008c42b458b3191412831f03d6c37bd748add61/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-644:{mountpoint:/var/lib/containers/storage/overlay/0656e3e46bc6591c5d8f2d9e8c9c49d5f1f4c2d8bfa2231a955e86b4e880d8f1/merged major:0 minor:644 fsType:overlay blockSize:0} overlay_0-646:{mountpoint:/var/lib/containers/storage/overlay/3584c68a2e3b7fb9f8608f6775c8a1b32725b61feeb16d2727e9342b2a008212/merged major:0 minor:646 fsType:overlay blockSize:0} overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/db1ff69b43fcc3c4bc631c31c54a76e0f7e13a2993a3ed313c135b7753e51c8e/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-667:{mountpoint:/var/lib/containers/storage/overlay/1b0ba331f1c6fe06638bf8abc85a39c6ccca20524cbe750d3c44b234ac8e89e2/merged major:0 minor:667 fsType:overlay blockSize:0} 
overlay_0-669:{mountpoint:/var/lib/containers/storage/overlay/f09e59b93af321b4cc73e608889f107331c764da0684d119eb6c160a2760105e/merged major:0 minor:669 fsType:overlay blockSize:0} overlay_0-69:{mountpoint:/var/lib/containers/storage/overlay/adf8bd5cee8057960e17eedccea07e9482024f954d681363c0700334cfc561fa/merged major:0 minor:69 fsType:overlay blockSize:0} overlay_0-696:{mountpoint:/var/lib/containers/storage/overlay/aecaffa007727d8859917b209bb08f42ab99b0bbd34861df43ae44680e136797/merged major:0 minor:696 fsType:overlay blockSize:0} overlay_0-698:{mountpoint:/var/lib/containers/storage/overlay/2ee68e4e2309b9d04417b768374814c6de4af78777b294d509c6bacfe860ecb3/merged major:0 minor:698 fsType:overlay blockSize:0} overlay_0-700:{mountpoint:/var/lib/containers/storage/overlay/a087560eca262a4f9cf14a0e0463ac1ace5c8ed822d1920106fb989c0ffe4574/merged major:0 minor:700 fsType:overlay blockSize:0} overlay_0-702:{mountpoint:/var/lib/containers/storage/overlay/7c9b26cfd022e610b6c0f7b607daf559a0e13104ef81d6b28e2ba5eb8e739807/merged major:0 minor:702 fsType:overlay blockSize:0} overlay_0-704:{mountpoint:/var/lib/containers/storage/overlay/1bc08e126f79df452e26d5a25ada3244a79d441f4eb2396626ecb497f20afc17/merged major:0 minor:704 fsType:overlay blockSize:0} overlay_0-706:{mountpoint:/var/lib/containers/storage/overlay/75c2054d5e2c1f41b3e2f141b6630ab3ea843ac41a6d617175f4d4698f052901/merged major:0 minor:706 fsType:overlay blockSize:0} overlay_0-708:{mountpoint:/var/lib/containers/storage/overlay/c797e5975518b8074a415bab71fe94563b204f3ce3c00fe890678a6ef3e2d98d/merged major:0 minor:708 fsType:overlay blockSize:0} overlay_0-712:{mountpoint:/var/lib/containers/storage/overlay/15ea851f235e2ce497e7835e307517f744dbb393d7ab63a13e1d4bb4b39f59d0/merged major:0 minor:712 fsType:overlay blockSize:0} overlay_0-725:{mountpoint:/var/lib/containers/storage/overlay/7b36fecc3d1b52c1eef3dbcd55b70b901ed3abb6bfa57738a942a10558706743/merged major:0 minor:725 fsType:overlay blockSize:0} 
overlay_0-732:{mountpoint:/var/lib/containers/storage/overlay/2ac61d7d59710b1888858cbe2f79c3163e3414fdde74ca3fad4509b1e8e42c72/merged major:0 minor:732 fsType:overlay blockSize:0} overlay_0-734:{mountpoint:/var/lib/containers/storage/overlay/aaf2746691323409b17620e01327a74f7053d4d66495445555310a9b22b44b85/merged major:0 minor:734 fsType:overlay blockSize:0} overlay_0-736:{mountpoint:/var/lib/containers/storage/overlay/ecc6966755542621f4d071f113acdb2397f107b5647895e53b72bfd3bace8ca7/merged major:0 minor:736 fsType:overlay blockSize:0} overlay_0-738:{mountpoint:/var/lib/containers/storage/overlay/15672f7c4f8d1791035b59642f6cdb8da6e12eab01570b15145e1214020d7ad5/merged major:0 minor:738 fsType:overlay blockSize:0} overlay_0-74:{mountpoint:/var/lib/containers/storage/overlay/73f6273a1d890a54c254d8676ba0f545dcb36dea6451fba066efad8879df00d3/merged major:0 minor:74 fsType:overlay blockSize:0} overlay_0-746:{mountpoint:/var/lib/containers/storage/overlay/d5e0bd218ee95b156ea82ae7e94096de199182c6fb6f587f2c9bff491ea47dc0/merged major:0 minor:746 fsType:overlay blockSize:0} overlay_0-767:{mountpoint:/var/lib/containers/storage/overlay/b0e26a495ce8cbf46ba4e712f2960b37c7bca2919b26de1eed706eb34c302e25/merged major:0 minor:767 fsType:overlay blockSize:0} overlay_0-768:{mountpoint:/var/lib/containers/storage/overlay/43d4354db87f8fb3179e0f233ef70322220240fbeec2f18b33287c1525f3159d/merged major:0 minor:768 fsType:overlay blockSize:0} overlay_0-770:{mountpoint:/var/lib/containers/storage/overlay/8130d8e2b184d304d5562e31341dd7f1b1c5418c86ae549c1f426cbfe3100eeb/merged major:0 minor:770 fsType:overlay blockSize:0} overlay_0-772:{mountpoint:/var/lib/containers/storage/overlay/4b4311a552d23c6f9a43fcc527624a4555718f5c7dee47d1da3eba73bc538bfa/merged major:0 minor:772 fsType:overlay blockSize:0} overlay_0-774:{mountpoint:/var/lib/containers/storage/overlay/2d0fcbbda2ac046a3aaaeaab2e6dc1c9ee323615fdafbb14f33d717829a5ac16/merged major:0 minor:774 fsType:overlay blockSize:0} 
overlay_0-782:{mountpoint:/var/lib/containers/storage/overlay/66187fc55aec6dbab1166016a73d4f42b977895be9d76220de80e4f66c5792b7/merged major:0 minor:782 fsType:overlay blockSize:0} overlay_0-785:{mountpoint:/var/lib/containers/storage/overlay/c4a17be7ce0de7e96462b6c765afdbd11091e326bdc389085113b905522af863/merged major:0 minor:785 fsType:overlay blockSize:0} overlay_0-797:{mountpoint:/var/lib/containers/storage/overlay/07233babf41501c3ffd0bedc94d6a9b35ec061f74a2024ea965c3893061ca760/merged major:0 minor:797 fsType:overlay blockSize:0} overlay_0-798:{mountpoint:/var/lib/containers/storage/overlay/209ec8a0bca649b0ae64187c27010687736fc37aa8218d1b11c885429af260ea/merged major:0 minor:798 fsType:overlay blockSize:0} overlay_0-800:{mountpoint:/var/lib/containers/storage/overlay/6d308e091616e19f8776a72510c951a0f487a263be8b6c9b7576243431da5c93/merged major:0 minor:800 fsType:overlay blockSize:0} overlay_0-802:{mountpoint:/var/lib/containers/storage/overlay/c69a95bfcbbc865a5ab2a67aab21973e32936d5661147a4bcec4534acbc0f1fc/merged major:0 minor:802 fsType:overlay blockSize:0} overlay_0-811:{mountpoint:/var/lib/containers/storage/overlay/27e7aa82511b29eec68c07e242c1d04673d6d9460e2380c26b40580ed5202f97/merged major:0 minor:811 fsType:overlay blockSize:0} overlay_0-828:{mountpoint:/var/lib/containers/storage/overlay/c3ab22c11e95ef6138fc76d8ffa550611b5ca7b1b85d3db48984a1cfe8686c45/merged major:0 minor:828 fsType:overlay blockSize:0} overlay_0-830:{mountpoint:/var/lib/containers/storage/overlay/0cdfc342328f595feae4b35e18180c97e7d1c6c10e143e2a68c977797079f851/merged major:0 minor:830 fsType:overlay blockSize:0} overlay_0-833:{mountpoint:/var/lib/containers/storage/overlay/c1b3a195e93ffcb1d4166dd63a8ab7f3ea3045cbda4a416870b21e5eb83bcc94/merged major:0 minor:833 fsType:overlay blockSize:0} overlay_0-84:{mountpoint:/var/lib/containers/storage/overlay/6e7f5c9c51b3b551a218aa733617a441266c509bb0eab97a2944a593a9c96f38/merged major:0 minor:84 fsType:overlay blockSize:0} 
overlay_0-840:{mountpoint:/var/lib/containers/storage/overlay/0e00271c3341950dd1aa918c4a5c54af6ac6aefbeebab2807a0689286c7306a7/merged major:0 minor:840 fsType:overlay blockSize:0} overlay_0-841:{mountpoint:/var/lib/containers/storage/overlay/8e0937c1d37d34562ddaee0f15bdf319c006cb43e283d7c3d00e54a5a1b75f25/merged major:0 minor:841 fsType:overlay blockSize:0} overlay_0-842:{mountpoint:/var/lib/containers/storage/overlay/96818fa681bf586671b0eea276a84aa26d8b896b95dd4f09e49dbdaee474bc71/merged major:0 minor:842 fsType:overlay blockSize:0} overlay_0-844:{mountpoint:/var/lib/containers/storage/overlay/a4bbcf2e1b5d3c03365044b0e25c7f20bf8cd3e4533f41def9426151e5b6c481/merged major:0 minor:844 fsType:overlay blockSize:0} overlay_0-846:{mountpoint:/var/lib/containers/storage/overlay/6847afc71592f8408a699bd086a297306fe42103fc7609ae4c5a6b533a9e3754/merged major:0 minor:846 fsType:overlay blockSize:0} overlay_0-850:{mountpoint:/var/lib/containers/storage/overlay/6839878bb5d162d8ade3c71a7c17a6999eafcb3b7ed2dfe33fe0ab1d29cf91fd/merged major:0 minor:850 fsType:overlay blockSize:0} overlay_0-852:{mountpoint:/var/lib/containers/storage/overlay/7c238ed08e01110572ace024f5bfe6158de935c0ef40bf8ec95f426cb2f1f2a5/merged major:0 minor:852 fsType:overlay blockSize:0} overlay_0-854:{mountpoint:/var/lib/containers/storage/overlay/a10451d229e52d63704cde1c29996e5acf47a57ed2368d2adccdddb8829d39a1/merged major:0 minor:854 fsType:overlay blockSize:0} overlay_0-870:{mountpoint:/var/lib/containers/storage/overlay/70b7ce3d1e246ff90a6986439aace051e6f1fcb8d0f95bbd4db6f5733faf8689/merged major:0 minor:870 fsType:overlay blockSize:0} overlay_0-876:{mountpoint:/var/lib/containers/storage/overlay/a04bae59d5177bc886cb8861ff7d28ab05bc8c6001a2e5bc77c5e659dab82db8/merged major:0 minor:876 fsType:overlay blockSize:0} overlay_0-885:{mountpoint:/var/lib/containers/storage/overlay/01a4867360a0248c15ab26fef390e71f7699bca96ea8b5b6bf39d58089a5c601/merged major:0 minor:885 fsType:overlay blockSize:0} 
overlay_0-888:{mountpoint:/var/lib/containers/storage/overlay/43ab4fcc428f403eaea5d1fac74eac98cd2ff7754b023f1de523c1e991332e13/merged major:0 minor:888 fsType:overlay blockSize:0} overlay_0-890:{mountpoint:/var/lib/containers/storage/overlay/54cb558806f6064d3982d35970f3d092b927d945cc1d674028480ed483f1907d/merged major:0 minor:890 fsType:overlay blockSize:0} overlay_0-894:{mountpoint:/var/lib/containers/storage/overlay/9f8824b1902f3e1fc49cb8514615d62db8c57d272c3bb23a77ae94790e21857c/merged major:0 minor:894 fsType:overlay blockSize:0} overlay_0-896:{mountpoint:/var/lib/containers/storage/overlay/9aa6f38fa043eb63df0fff340fc8cb50601d7c53e252bc6ca7648346fcb97ff7/merged major:0 minor:896 fsType:overlay blockSize:0} overlay_0-911:{mountpoint:/var/lib/containers/storage/overlay/29e22b78ff9f0939209de7585f91aebd0b8563962a4efd0f55ce88d3ad1ad5bf/merged major:0 minor:911 fsType:overlay blockSize:0} overlay_0-914:{mountpoint:/var/lib/containers/storage/overlay/e8f7474f369081aa0b8fb64a7d2c6d6128e7529f731e7dc3ac61f8a843e1e12c/merged major:0 minor:914 fsType:overlay blockSize:0} overlay_0-922:{mountpoint:/var/lib/containers/storage/overlay/f0120e46058bc7dcfc45fb1fad1b1a5bce09fa39c79de2763829691262e191e7/merged major:0 minor:922 fsType:overlay blockSize:0} overlay_0-925:{mountpoint:/var/lib/containers/storage/overlay/54a6802128f02c1c872cc2f3ca6a5d59700dec99a7c243be553d25f5fb227bde/merged major:0 minor:925 fsType:overlay blockSize:0} overlay_0-927:{mountpoint:/var/lib/containers/storage/overlay/9d1b8a521ef764b9c818574d0ad2588535ee4f6a6f147eded96344c538201bff/merged major:0 minor:927 fsType:overlay blockSize:0} overlay_0-93:{mountpoint:/var/lib/containers/storage/overlay/1b47bc371823d9cee329402ac372909a8693671675d3ba108347898318a74ad7/merged major:0 minor:93 fsType:overlay blockSize:0} overlay_0-930:{mountpoint:/var/lib/containers/storage/overlay/d9a9219ffef3b79ed132e15b0d1c2f25068c0378bef93c13a2a77a137b8e190c/merged major:0 minor:930 fsType:overlay blockSize:0} 
overlay_0-933:{mountpoint:/var/lib/containers/storage/overlay/864f46155afbbf71f93541c6c317200b24b3404fcc4edecc1f4b990790898c19/merged major:0 minor:933 fsType:overlay blockSize:0} overlay_0-935:{mountpoint:/var/lib/containers/storage/overlay/eaacc663bae06fa65eb6731c937678025506ed14351826a7e9bd04395151e4b1/merged major:0 minor:935 fsType:overlay blockSize:0} overlay_0-936:{mountpoint:/var/lib/containers/storage/overlay/79581ae4433ace8cc57422c64980bb0247a470aa747fb2d9be4450bcd732ba93/merged major:0 minor:936 fsType:overlay blockSize:0} overlay_0-938:{mountpoint:/var/lib/containers/storage/overlay/d64fb641590999f53a2b2b330991710dea34c828af653898733ae5574931ce67/merged major:0 minor:938 fsType:overlay blockSize:0} overlay_0-948:{mountpoint:/var/lib/containers/storage/overlay/2b53e4cd70885b280db709b5af14be781c705adef614c4570194d4ff5883f464/merged major:0 minor:948 fsType:overlay blockSize:0} overlay_0-949:{mountpoint:/var/lib/containers/storage/overlay/734f3baf4c210afd78f69b62dba7b26ee260fe601341454e1df15686adfeb400/merged major:0 minor:949 fsType:overlay blockSize:0} overlay_0-952:{mountpoint:/var/lib/containers/storage/overlay/2c6cb0cbeda6969ef27ede316d3abc1d30d46a1de16fe1c52b7f22707ad97c64/merged major:0 minor:952 fsType:overlay blockSize:0} overlay_0-967:{mountpoint:/var/lib/containers/storage/overlay/5f513c47cb195ed2893f5a98073099085eb20b143936df9e02d878fc7ed49967/merged major:0 minor:967 fsType:overlay blockSize:0} overlay_0-975:{mountpoint:/var/lib/containers/storage/overlay/bf1dd8b9ba8fbe10b8e29a1a3ea5fb772e155accfa748a5625138bdf8e51639c/merged major:0 minor:975 fsType:overlay blockSize:0} overlay_0-985:{mountpoint:/var/lib/containers/storage/overlay/8e8d4fcffd08393e5f0a61d36d8d093dc9fce50651d4a4990c52e631a7735006/merged major:0 minor:985 fsType:overlay blockSize:0} overlay_0-992:{mountpoint:/var/lib/containers/storage/overlay/cbd38de7336c05ed35b380afe303ac999d676457c55639a56a01d85068e48eb0/merged major:0 minor:992 fsType:overlay blockSize:0} 
overlay_0-999:{mountpoint:/var/lib/containers/storage/overlay/7b7eda07f754f8656ad61f3f537ebadecf43bb753fdb74faf242171a336c8833/merged major:0 minor:999 fsType:overlay blockSize:0}] Dec 03 20:09:25.350204 master-0 kubenswrapper[29252]: I1203 20:09:25.348562 29252 manager.go:217] Machine: {Timestamp:2025-12-03 20:09:25.346845781 +0000 UTC m=+0.160390754 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2799998 MemoryCapacity:50514153472 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:9870f3c6b33d40089e247d1fa3d9248c SystemUUID:9870f3c6-b33d-4008-9e24-7d1fa3d9248c BootID:2118df0c-6317-4582-908c-71a63e50558d Filesystems:[{Device:/var/lib/kubelet/pods/2f618ea7-3ad7-4dce-b450-a8202285f312/volumes/kubernetes.io~projected/kube-api-access-4c9qq DeviceMajor:0 DeviceMinor:170 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/49efa7facfce8d50bf6399ae2e6f96a9a16dc5f311b520ce196c50d981643fd1/userdata/shm DeviceMajor:0 DeviceMinor:320 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/73b7027e-44f5-4c7b-9226-585a90530535/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:788 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-111 DeviceMajor:0 DeviceMinor:111 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1021 DeviceMajor:0 DeviceMinor:1021 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1677ae8793f1b3e61b335ded5b7ac95e63d604742bdba149b92ecb06281d760f/userdata/shm DeviceMajor:0 DeviceMinor:171 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-512 DeviceMajor:0 DeviceMinor:512 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:overlay_0-173 DeviceMajor:0 DeviceMinor:173 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-611 DeviceMajor:0 DeviceMinor:611 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/11e2c94f-f9e9-415b-a550-3006a4632ba4/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:296 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8/volumes/kubernetes.io~projected/kube-api-access-qzd2g DeviceMajor:0 DeviceMinor:654 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-203 DeviceMajor:0 DeviceMinor:203 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-453 DeviceMajor:0 DeviceMinor:453 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-841 DeviceMajor:0 DeviceMinor:841 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d7171597-cb9a-451c-80a4-64cfccf885f0/volumes/kubernetes.io~empty-dir/tmp DeviceMajor:0 DeviceMinor:545 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/6404bbc7-8ca9-4f20-8ce7-40f855555160/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert DeviceMajor:0 DeviceMinor:963 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-167 DeviceMajor:0 DeviceMinor:167 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-362 DeviceMajor:0 DeviceMinor:362 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-646 DeviceMajor:0 DeviceMinor:646 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-440 DeviceMajor:0 DeviceMinor:440 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-165 DeviceMajor:0 DeviceMinor:165 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/f749c7f2-1fd7-4078-a92d-0ae5523998ac/volumes/kubernetes.io~projected/kube-api-access-lvklf DeviceMajor:0 DeviceMinor:972 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-133 DeviceMajor:0 DeviceMinor:133 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1025 DeviceMajor:0 DeviceMinor:1025 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a336a885dee021602a811c06c05965d3ceafbc2a4e4dc7061efbb563491832b7/userdata/shm DeviceMajor:0 DeviceMinor:324 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-613 DeviceMajor:0 DeviceMinor:613 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/42bc79c7b9ffa15ca475a4edc477b358626509600367cbde78e61fb4d3277efb/userdata/shm DeviceMajor:0 DeviceMinor:892 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-397 DeviceMajor:0 DeviceMinor:397 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b4d8dcd686b7f438d91027e16be00d386ed8e811dad59ae3d10143a981ef3034/userdata/shm DeviceMajor:0 DeviceMinor:543 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-870 DeviceMajor:0 DeviceMinor:870 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-155 DeviceMajor:0 DeviceMinor:155 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2f618ea7-3ad7-4dce-b450-a8202285f312/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-157 DeviceMajor:0 DeviceMinor:157 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/01d51d9a-9beb-4357-9dc2-aeac210cd0c4/volumes/kubernetes.io~secret/serving-cert 
DeviceMajor:0 DeviceMinor:298 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/2d43df9b-bb29-4581-8cd9-f3b9c0c0e4d9/volumes/kubernetes.io~projected/kube-api-access-grk2s DeviceMajor:0 DeviceMinor:500 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a19b8f9e-6299-43bf-9aa5-22071b855773/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:673 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-696 DeviceMajor:0 DeviceMinor:696 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1004 DeviceMajor:0 DeviceMinor:1004 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-606 DeviceMajor:0 DeviceMinor:606 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257074688 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a185ee17-4b4b-4d20-a8ed-56a2a01f1807/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:294 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/01d51d9a-9beb-4357-9dc2-aeac210cd0c4/volumes/kubernetes.io~projected/kube-api-access-6sqtm DeviceMajor:0 DeviceMinor:323 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-584 DeviceMajor:0 DeviceMinor:584 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-811 DeviceMajor:0 DeviceMinor:811 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-391 DeviceMajor:0 DeviceMinor:391 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a710102c-72fb-4d8d-ad99-71940368a09e/volumes/kubernetes.io~projected/kube-api-access-zgmkc DeviceMajor:0 DeviceMinor:887 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/048697d9a6342582c8b3059ecb9a0cfe7c0a764a192a00f0ded82f4081cc7252/userdata/shm DeviceMajor:0 DeviceMinor:136 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/11e2c94f-f9e9-415b-a550-3006a4632ba4/volumes/kubernetes.io~projected/kube-api-access-pfqnq DeviceMajor:0 DeviceMinor:319 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:989 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-930 DeviceMajor:0 DeviceMinor:930 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-127 DeviceMajor:0 DeviceMinor:127 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-854 DeviceMajor:0 DeviceMinor:854 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-785 DeviceMajor:0 DeviceMinor:785 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/128ed384-7ab6-41b6-bf45-c8fda917d52f/volumes/kubernetes.io~projected/kube-api-access-7qrgh DeviceMajor:0 DeviceMinor:336 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1277 DeviceMajor:0 DeviceMinor:1277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1369 DeviceMajor:0 DeviceMinor:1369 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/dbe65295e2c898be586dca5d88680f9b16d8f0721a6e9ed04f2477053779cf26/userdata/shm DeviceMajor:0 DeviceMinor:576 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-846 DeviceMajor:0 DeviceMinor:846 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/cd35fc5f-07ab-4c66-9b80-33a598d417ef/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls DeviceMajor:0 DeviceMinor:957 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1111 DeviceMajor:0 DeviceMinor:1111 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/94345aca0ffddac869becc835a6c22d571aca8cdc67c8d1a0844b640b65b6099/userdata/shm DeviceMajor:0 DeviceMinor:961 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1047 DeviceMajor:0 DeviceMinor:1047 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-437 DeviceMajor:0 DeviceMinor:437 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-149 DeviceMajor:0 DeviceMinor:149 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-732 DeviceMajor:0 DeviceMinor:732 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/78a864f2-934f-4197-9753-24c9bc7f1fca/volumes/kubernetes.io~projected/kube-api-access-59d2r DeviceMajor:0 DeviceMinor:306 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/c593a75e-c2af-4419-94da-e0c9ff14c41f/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:665 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/d196dca7-f940-4aa0-b20a-214d22b62db6/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:727 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-598 DeviceMajor:0 DeviceMinor:598 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-933 DeviceMajor:0 DeviceMinor:933 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1202 DeviceMajor:0 DeviceMinor:1202 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:overlay_0-1409 DeviceMajor:0 DeviceMinor:1409 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/367c2c7c-1fc8-4608-aa94-b64c6c70cc61/volumes/kubernetes.io~projected/kube-api-access-hb5j7 DeviceMajor:0 DeviceMinor:487 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/d7171597-cb9a-451c-80a4-64cfccf885f0/volumes/kubernetes.io~empty-dir/etc-tuned DeviceMajor:0 DeviceMinor:546 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c0a70d9d0d86d6f719dfcdce57dead2d1b8eec5a2b0f03bea14ce004f4ee91ea/userdata/shm DeviceMajor:0 DeviceMinor:679 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:608 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1068 DeviceMajor:0 DeviceMinor:1068 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-431 DeviceMajor:0 DeviceMinor:431 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-828 DeviceMajor:0 DeviceMinor:828 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-999 DeviceMajor:0 DeviceMinor:999 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1154 DeviceMajor:0 DeviceMinor:1154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/943feb0d-7d31-446a-9100-dfc4ef013d12/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:293 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/c593a75e-c2af-4419-94da-e0c9ff14c41f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:726 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1040 DeviceMajor:0 DeviceMinor:1040 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1268 DeviceMajor:0 DeviceMinor:1268 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/daa8efc0-4514-4a14-80f5-ab9eca53a127/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:291 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ef04edbf93893169f2ce0656a624fe737e2b430675591752e41e98b545e6bf40/userdata/shm DeviceMajor:0 DeviceMinor:334 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-706 DeviceMajor:0 DeviceMinor:706 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-985 DeviceMajor:0 DeviceMinor:985 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1023 DeviceMajor:0 DeviceMinor:1023 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6eb4700c-6af0-468b-afc8-1e09b902d6bf/volumes/kubernetes.io~projected/kube-api-access-w7nkb DeviceMajor:0 DeviceMinor:119 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-41 DeviceMajor:0 DeviceMinor:41 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bb807fb004e1c5a8c12ce908fa4f2effefa5e62f25142bb2fe3ec8dd74d140f1/userdata/shm DeviceMajor:0 DeviceMinor:108 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-138 DeviceMajor:0 DeviceMinor:138 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/78a864f2-934f-4197-9753-24c9bc7f1fca/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:287 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:289 Capacity:49335554048 Type:vfs Inodes:6166278 
HasInodes:true} {Device:overlay_0-768 DeviceMajor:0 DeviceMinor:768 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-213 DeviceMajor:0 DeviceMinor:213 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-669 DeviceMajor:0 DeviceMinor:669 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/90610a53-b590-491e-8014-f0704afdc6e1/volumes/kubernetes.io~projected/kube-api-access-4wcmd DeviceMajor:0 DeviceMinor:953 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b0f89725c2a6c3514238a4cc365a81c3b56d37ffea32d9d0a2e9a1e91fecf2fb/userdata/shm DeviceMajor:0 DeviceMinor:629 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fd040c8de744a713ee80a954f75065a2b691638426b8496773ad0910f9875316/userdata/shm DeviceMajor:0 DeviceMinor:122 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-191 DeviceMajor:0 DeviceMinor:191 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-876 DeviceMajor:0 DeviceMinor:876 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ad22d8ed-2476-441b-aa3b-a7845606b0ac/volumes/kubernetes.io~secret/machine-api-operator-tls DeviceMajor:0 DeviceMinor:1019 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1101 DeviceMajor:0 DeviceMinor:1101 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1226 DeviceMajor:0 DeviceMinor:1226 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-401 DeviceMajor:0 DeviceMinor:401 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-596 DeviceMajor:0 DeviceMinor:596 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/a57372b0142961fc3eb84ca639278793b8a44eddc61b15169b7d9172b7c9d91a/userdata/shm DeviceMajor:0 DeviceMinor:522 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/988c74f72d6d3987e23eadc15e10a46097f9412b88f2d407e398a913b05fa016/userdata/shm DeviceMajor:0 DeviceMinor:488 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/56e013ee-ea7a-4780-8986-a7fd1b5a3a3f/volumes/kubernetes.io~projected/kube-api-access-vvlxr DeviceMajor:0 DeviceMinor:455 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-844 DeviceMajor:0 DeviceMinor:844 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1049 DeviceMajor:0 DeviceMinor:1049 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-395 DeviceMajor:0 DeviceMinor:395 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c4d45235-fb1a-4626-a41e-b1e34f7bf76e/volumes/kubernetes.io~projected/kube-api-access-qhg82 DeviceMajor:0 DeviceMinor:185 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-667 DeviceMajor:0 DeviceMinor:667 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-373 DeviceMajor:0 DeviceMinor:373 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-800 DeviceMajor:0 DeviceMinor:800 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-537 DeviceMajor:0 DeviceMinor:537 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-225 DeviceMajor:0 DeviceMinor:225 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-938 DeviceMajor:0 DeviceMinor:938 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-936 DeviceMajor:0 DeviceMinor:936 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:overlay_0-725 DeviceMajor:0 DeviceMinor:725 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/08c01ca5f1fe5f2ef9cd1ac17b729f8e737e95206dcb86f9ce9c09225b746a55/userdata/shm DeviceMajor:0 DeviceMinor:342 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8b5478f55322c86d9620262432fda124f2df1ae79e09d51d64ffbf6929820091/userdata/shm DeviceMajor:0 DeviceMinor:358 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-413 DeviceMajor:0 DeviceMinor:413 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-464 DeviceMajor:0 DeviceMinor:464 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-767 DeviceMajor:0 DeviceMinor:767 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-153 DeviceMajor:0 DeviceMinor:153 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-797 DeviceMajor:0 DeviceMinor:797 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-850 DeviceMajor:0 DeviceMinor:850 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-215 DeviceMajor:0 DeviceMinor:215 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1343 DeviceMajor:0 DeviceMinor:1343 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6bb19329-c50c-4214-94c8-7e8771b99233/volumes/kubernetes.io~projected/kube-api-access-kszjr DeviceMajor:0 DeviceMinor:923 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-384 DeviceMajor:0 DeviceMinor:384 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-407 DeviceMajor:0 DeviceMinor:407 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-842 DeviceMajor:0 DeviceMinor:842 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/433c3273-c99e-4d68-befc-06f92d2fc8d5/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls DeviceMajor:0 DeviceMinor:656 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-935 DeviceMajor:0 DeviceMinor:935 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-507 DeviceMajor:0 DeviceMinor:507 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-592 DeviceMajor:0 DeviceMinor:592 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7311eb8e0cfeb885addad4bf6c0ceae3553a0417b770ce4938a40cee85fb2dfd/userdata/shm DeviceMajor:0 DeviceMinor:1017 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-131 DeviceMajor:0 DeviceMinor:131 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257078784 Type:vfs Inodes:1048576 HasInodes:true} {Device:/var/lib/kubelet/pods/a185ee17-4b4b-4d20-a8ed-56a2a01f1807/volumes/kubernetes.io~projected/kube-api-access-sxqph DeviceMajor:0 DeviceMinor:302 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/1f82c7a1-ec21-497d-86f2-562cafa7ace7/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:791 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-896 DeviceMajor:0 DeviceMinor:896 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-130 DeviceMajor:0 DeviceMinor:130 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-602 DeviceMajor:0 DeviceMinor:602 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/2ae6841c89bd0bc9cfc6015de7cc1e3a4bbed5c62b59fd91032790f9ed1aaac0/userdata/shm DeviceMajor:0 DeviceMinor:456 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ba68608f-6b36-455e-b80b-d19237df9312/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls DeviceMajor:0 DeviceMinor:671 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1316 DeviceMajor:0 DeviceMinor:1316 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f9f99422-7991-40ef-92a1-de2e603e47b9/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:297 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-548 DeviceMajor:0 DeviceMinor:548 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-143 DeviceMajor:0 DeviceMinor:143 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1388 DeviceMajor:0 DeviceMinor:1388 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-211 DeviceMajor:0 DeviceMinor:211 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-523 DeviceMajor:0 DeviceMinor:523 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-109 DeviceMajor:0 DeviceMinor:109 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c52974d8-fbe6-444b-97ae-468482eebac8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:510 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-258 DeviceMajor:0 DeviceMinor:258 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-564 DeviceMajor:0 DeviceMinor:564 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/f96c70ce-314a-4919-91e9-cc776a620846/volumes/kubernetes.io~projected/kube-api-access-lkhn4 DeviceMajor:0 DeviceMinor:796 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-234 DeviceMajor:0 DeviceMinor:234 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d5f33153-bff1-403f-ae17-b7e90500365d/volumes/kubernetes.io~projected/kube-api-access-5sdw4 DeviceMajor:0 DeviceMinor:305 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/af2023e1-9c7a-40af-a6bf-fba31c3565b1/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:973 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-364 DeviceMajor:0 DeviceMinor:364 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-708 DeviceMajor:0 DeviceMinor:708 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-712 DeviceMajor:0 DeviceMinor:712 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/38072447ae412858938614108f0275e0c66bb65d93f888cc2667f73663ae0790/userdata/shm DeviceMajor:0 DeviceMinor:346 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-888 DeviceMajor:0 DeviceMinor:888 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-50 DeviceMajor:0 DeviceMinor:50 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1030 DeviceMajor:0 DeviceMinor:1030 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-266 DeviceMajor:0 DeviceMinor:266 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-481 DeviceMajor:0 DeviceMinor:481 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1358 DeviceMajor:0 DeviceMinor:1358 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:/var/lib/kubelet/pods/b638f207-31df-4298-8801-4da6031deefc/volumes/kubernetes.io~projected/kube-api-access-trv6b DeviceMajor:0 DeviceMinor:553 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a19b8f9e-6299-43bf-9aa5-22071b855773/volumes/kubernetes.io~secret/profile-collector-cert DeviceMajor:0 DeviceMinor:300 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/0d4e4f88-7106-4a46-8b63-053345922fb0/volumes/kubernetes.io~secret/package-server-manager-serving-cert DeviceMajor:0 DeviceMinor:674 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/d5f33153-bff1-403f-ae17-b7e90500365d/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:676 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-700 DeviceMajor:0 DeviceMinor:700 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d9703a47499dc38f6845f2d55184a1985a6a96f9f0e663c0707d6562d50b0c0c/userdata/shm DeviceMajor:0 DeviceMinor:1260 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1353 DeviceMajor:0 DeviceMinor:1353 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-563 DeviceMajor:0 DeviceMinor:563 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1394 DeviceMajor:0 DeviceMinor:1394 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/dbc9d9f3c90ebc5bbbfe36c2028e07277634315bcc3781675056eb652072f16a/userdata/shm DeviceMajor:0 DeviceMinor:591 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/5decce88-c71e-411c-87b5-a37dd0f77e7b/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:304 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-798 
DeviceMajor:0 DeviceMinor:798 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1088 DeviceMajor:0 DeviceMinor:1088 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-992 DeviceMajor:0 DeviceMinor:992 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2/volumes/kubernetes.io~projected/kube-api-access-7bdn5 DeviceMajor:0 DeviceMinor:528 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-226 DeviceMajor:0 DeviceMinor:226 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-469 DeviceMajor:0 DeviceMinor:469 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-409 DeviceMajor:0 DeviceMinor:409 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/46b628f030def8d568abe6c88697be71ce064596569bc0a66bddd83c9802cf26/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-242 DeviceMajor:0 DeviceMinor:242 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-356 DeviceMajor:0 DeviceMinor:356 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-704 DeviceMajor:0 DeviceMinor:704 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c52974d8-fbe6-444b-97ae-468482eebac8/volumes/kubernetes.io~projected/kube-api-access-p7vxl DeviceMajor:0 DeviceMinor:511 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1173 DeviceMajor:0 DeviceMinor:1173 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c3afc439-ccaa-4751-95a1-ac7557e326f0/volumes/kubernetes.io~projected/kube-api-access-ljsr6 DeviceMajor:0 DeviceMinor:1315 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:overlay_0-890 DeviceMajor:0 DeviceMinor:890 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d48938de69765a143714e3f72409a39d0152006d3aa2fff72b2bf45a3ae1e272/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/78a864f2-934f-4197-9753-24c9bc7f1fca/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:295 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-352 DeviceMajor:0 DeviceMinor:352 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/63e3d36d-1676-4f90-ac9a-d85b861a4655/volumes/kubernetes.io~projected/kube-api-access-x66sr DeviceMajor:0 DeviceMinor:499 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d6be581fee143ab495ef9288ca56547cef2f234318e097d914e85b3da00c3425/userdata/shm DeviceMajor:0 DeviceMinor:590 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/b8709c6c-8729-4702-a3fb-35a072855096/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:878 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-423 DeviceMajor:0 DeviceMinor:423 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-922 DeviceMajor:0 DeviceMinor:922 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/28a18281ec372fa67100c899a3d3b1ddbaac78df588b0cd751eb6a61fdd46f87/userdata/shm DeviceMajor:0 DeviceMinor:1029 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-379 DeviceMajor:0 DeviceMinor:379 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/6a82ff78-4383-4ca8-8a72-98c2ee50ffe2/volumes/kubernetes.io~secret/samples-operator-tls DeviceMajor:0 DeviceMinor:965 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1348 DeviceMajor:0 DeviceMinor:1348 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6a82ff78-4383-4ca8-8a72-98c2ee50ffe2/volumes/kubernetes.io~projected/kube-api-access-dl5h7 DeviceMajor:0 DeviceMinor:966 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1082 DeviceMajor:0 DeviceMinor:1082 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/daa8efc0-4514-4a14-80f5-ab9eca53a127/volumes/kubernetes.io~projected/kube-api-access-rbsx8 DeviceMajor:0 DeviceMinor:308 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/15fa7ece9624e476a927666dc492b7bd2df94f7942d686ce643ec390d690ecca/userdata/shm DeviceMajor:0 DeviceMinor:387 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-833 DeviceMajor:0 DeviceMinor:833 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/371917da-b783-4acc-81af-1cfc903269f4/volumes/kubernetes.io~projected/kube-api-access-w4v7k DeviceMajor:0 DeviceMinor:341 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1065 DeviceMajor:0 DeviceMinor:1065 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-782 DeviceMajor:0 DeviceMinor:782 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/46b5d4d0-b841-4e87-84b4-85911ff04325/volumes/kubernetes.io~projected/kube-api-access-s2c85 DeviceMajor:0 DeviceMinor:146 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-271 
DeviceMajor:0 DeviceMinor:271 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5b3ee9a2-0f17-4a04-9191-b60684ef6c29/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:292 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/52e10ffcc1fbdf8f2cb9d16e424d95ecef32b76b41b9a925005182a3b5446923/userdata/shm DeviceMajor:0 DeviceMinor:577 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1038 DeviceMajor:0 DeviceMinor:1038 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1252 DeviceMajor:0 DeviceMinor:1252 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-468 DeviceMajor:0 DeviceMinor:468 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-746 DeviceMajor:0 DeviceMinor:746 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c8fa62db9ae1d5afc07c786415f97448d1baeaca29acf6f92b49c7da920421a7/userdata/shm DeviceMajor:0 DeviceMinor:317 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-354 DeviceMajor:0 DeviceMinor:354 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-503 DeviceMajor:0 DeviceMinor:503 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-498 DeviceMajor:0 DeviceMinor:498 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-555 DeviceMajor:0 DeviceMinor:555 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/75ad2809d96a1369619e26966fceb45e6c13fc754c6dc35b21749d37ba20ab2a/userdata/shm DeviceMajor:0 DeviceMinor:689 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-451 DeviceMajor:0 DeviceMinor:451 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/0d4e4f88-7106-4a46-8b63-053345922fb0/volumes/kubernetes.io~projected/kube-api-access-crfnp DeviceMajor:0 DeviceMinor:286 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/b84835e3-e8bc-4aa4-a8f3-f9be702a358a/volumes/kubernetes.io~projected/kube-api-access-vtwbs DeviceMajor:0 DeviceMinor:309 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0d31ad42fdaaa8d9f4506f72df0676530f77957571a46716dc1e834dfef43d2c/userdata/shm DeviceMajor:0 DeviceMinor:505 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-702 DeviceMajor:0 DeviceMinor:702 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-124 DeviceMajor:0 DeviceMinor:124 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5decce88-c71e-411c-87b5-a37dd0f77e7b/volumes/kubernetes.io~projected/kube-api-access-mr8x9 DeviceMajor:0 DeviceMinor:340 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/75ad395a74500e699e0114a02b486d58badb2f6e46a9b16d69b6836ed61de9f2/userdata/shm DeviceMajor:0 DeviceMinor:1007 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/09f5df5c-fd9b-430d-aecc-242594b4aff1/volumes/kubernetes.io~secret/machine-approver-tls DeviceMajor:0 DeviceMinor:918 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-375 DeviceMajor:0 DeviceMinor:375 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7ed25861-1328-45e7-922e-37588a0b019c/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:444 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ad22d8ed-2476-441b-aa3b-a7845606b0ac/volumes/kubernetes.io~projected/kube-api-access-xjn9m DeviceMajor:0 DeviceMinor:987 
Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b9062d56a3074fcc3f3a4a8ecee0d9736b5e9e6f4c5eef18fa307a87652c36a3/userdata/shm DeviceMajor:0 DeviceMinor:1002 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b7019b680708a6b0cc34565d068ec422e5cf82d6c1379cc668471d678f72f33d/userdata/shm DeviceMajor:0 DeviceMinor:163 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/f96c70ce-314a-4919-91e9-cc776a620846/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:795 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0b9c573f7ba19dc3323c14093fb10a43f5d1d1f19bc23f8da28f974d65efe3f1/userdata/shm DeviceMajor:0 DeviceMinor:594 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/1c22cb59-5083-4be6-9998-a9e67a2c20cd/volumes/kubernetes.io~projected/kube-api-access-7cnmn DeviceMajor:0 DeviceMinor:367 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-372 DeviceMajor:0 DeviceMinor:372 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1367 DeviceMajor:0 DeviceMinor:1367 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-114 DeviceMajor:0 DeviceMinor:114 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-344 DeviceMajor:0 DeviceMinor:344 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1224 DeviceMajor:0 DeviceMinor:1224 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-193 DeviceMajor:0 DeviceMinor:193 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:580 Capacity:49335554048 Type:vfs 
Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3cff086346a7c5e3777cf149e0e1d8f97d1a0c5b1f9e52848dc132dcdccf253d/userdata/shm DeviceMajor:0 DeviceMinor:982 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-885 DeviceMajor:0 DeviceMinor:885 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1caef0a819570cf6a2811866d8d10fd6e09b188be5e4d722967523e3ffefcc98/userdata/shm DeviceMajor:0 DeviceMinor:116 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1391 DeviceMajor:0 DeviceMinor:1391 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:333 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-830 DeviceMajor:0 DeviceMinor:830 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cd35fc5f-07ab-4c66-9b80-33a598d417ef/volumes/kubernetes.io~projected/kube-api-access-qk5wb DeviceMajor:0 DeviceMinor:958 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/b2021db5-b27a-4e06-beec-d9ba82aa1ffc/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:655 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/7ed25861-1328-45e7-922e-37588a0b019c/volumes/kubernetes.io~projected/kube-api-access-cv24n DeviceMajor:0 DeviceMinor:301 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1214 DeviceMajor:0 DeviceMinor:1214 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-840 DeviceMajor:0 DeviceMinor:840 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:overlay_0-93 DeviceMajor:0 DeviceMinor:93 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-74 DeviceMajor:0 DeviceMinor:74 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-405 DeviceMajor:0 DeviceMinor:405 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e/volumes/kubernetes.io~projected/kube-api-access-k7t26 DeviceMajor:0 DeviceMinor:990 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-975 DeviceMajor:0 DeviceMinor:975 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-403 DeviceMajor:0 DeviceMinor:403 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d1fa423033e36d1ea0fc496cea36bc32452af1d97e8f92ea7f243855f38360a2/userdata/shm DeviceMajor:0 DeviceMinor:1008 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/943feb0d-7d31-446a-9100-dfc4ef013d12/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:307 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-774 DeviceMajor:0 DeviceMinor:774 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/06784de650baea189078ed954c09cf1adab506ac7eaeb2563708127435863bfd/userdata/shm DeviceMajor:0 DeviceMinor:924 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0c22de28b514bd9de5323a780b66baaf0574a8898405da26c3c85130d1ec1ce9/userdata/shm DeviceMajor:0 DeviceMinor:315 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/c593a75e-c2af-4419-94da-e0c9ff14c41f/volumes/kubernetes.io~secret/encryption-config 
DeviceMajor:0 DeviceMinor:664 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:368 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-420 DeviceMajor:0 DeviceMinor:420 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-427 DeviceMajor:0 DeviceMinor:427 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-377 DeviceMajor:0 DeviceMinor:377 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-736 DeviceMajor:0 DeviceMinor:736 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/af2023e1-9c7a-40af-a6bf-fba31c3565b1/volumes/kubernetes.io~projected/kube-api-access-hdd6z DeviceMajor:0 DeviceMinor:974 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1cd08c33a38d123c20d17a144cb73cdc913867f657f3ed47969c25f2ac5811c9/userdata/shm DeviceMajor:0 DeviceMinor:981 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fcef5c26197d88811bd202bc70d1bd384b05a27d2d38eb35b486b482203bd347/userdata/shm DeviceMajor:0 DeviceMinor:104 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/90610a53-b590-491e-8014-f0704afdc6e1/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls DeviceMajor:0 DeviceMinor:900 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/d5f33153-bff1-403f-ae17-b7e90500365d/volumes/kubernetes.io~secret/profile-collector-cert DeviceMajor:0 DeviceMinor:299 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/ae7d38a01611431f5b1b916ad750e69ffaaacefc6c9b10d1dad35bf2f9161d22/userdata/shm DeviceMajor:0 DeviceMinor:678 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b/volumes/kubernetes.io~projected/kube-api-access-c5dpx DeviceMajor:0 DeviceMinor:369 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1263 DeviceMajor:0 DeviceMinor:1263 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/73b7027e-44f5-4c7b-9226-585a90530535/volumes/kubernetes.io~projected/kube-api-access-7pf5q DeviceMajor:0 DeviceMinor:789 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-925 DeviceMajor:0 DeviceMinor:925 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/volumes/kubernetes.io~projected/kube-api-access-457ln DeviceMajor:0 DeviceMinor:314 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/d7171597-cb9a-451c-80a4-64cfccf885f0/volumes/kubernetes.io~projected/kube-api-access-gs8fx DeviceMajor:0 DeviceMinor:547 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0c718666bc5fa03621c39d9ebf94cf18c64cdbdf19cd3de5b727dc2e38eb8ea5/userdata/shm DeviceMajor:0 DeviceMinor:730 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-952 DeviceMajor:0 DeviceMinor:952 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-112 DeviceMajor:0 DeviceMinor:112 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-181 DeviceMajor:0 DeviceMinor:181 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-189 
DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1109 DeviceMajor:0 DeviceMinor:1109 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1064 DeviceMajor:0 DeviceMinor:1064 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-147 DeviceMajor:0 DeviceMinor:147 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-250 DeviceMajor:0 DeviceMinor:250 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:288 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/433c3273-c99e-4d68-befc-06f92d2fc8d5/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:657 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-69 DeviceMajor:0 DeviceMinor:69 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-914 DeviceMajor:0 DeviceMinor:914 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1043 DeviceMajor:0 DeviceMinor:1043 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-770 DeviceMajor:0 DeviceMinor:770 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/87f1759a-7df4-442e-a22d-6de8d54be333/volumes/kubernetes.io~projected/kube-api-access-wvllg DeviceMajor:0 DeviceMinor:135 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:322 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/db6143edbd1b68cfe8bbe553ee3ca87d799ea0e63aff48d4d038dfa43496204a/userdata/shm DeviceMajor:0 DeviceMinor:382 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9ae1ee41d37f4b0f2aff315d3bc5733756252272483e29c7d8046c2d96630d79/userdata/shm DeviceMajor:0 DeviceMinor:501 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-644 DeviceMajor:0 DeviceMinor:644 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e2387dbfcc1d429cb65e949d260da12685f9167ab5d7e2e2846349bd7d4f915e/userdata/shm DeviceMajor:0 DeviceMinor:680 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/c3afc439-ccaa-4751-95a1-ac7557e326f0/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:1314 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-582 DeviceMajor:0 DeviceMinor:582 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1054 DeviceMajor:0 DeviceMinor:1054 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-911 DeviceMajor:0 DeviceMinor:911 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d210062f-c07e-419f-a551-c37571565686/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:161 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:290 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/128ed384-7ab6-41b6-bf45-c8fda917d52f/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:574 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/var/lib/kubelet/pods/1c22cb59-5083-4be6-9998-a9e67a2c20cd/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:366 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-102 DeviceMajor:0 DeviceMinor:102 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d0d3daec6476579642facd81cb6257eb10f7c617299056e3757e4a0c79c948a4/userdata/shm DeviceMajor:0 DeviceMinor:121 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/b673cb04-f6f0-4113-bdcd-d6685b942c9f/volumes/kubernetes.io~secret/marketplace-operator-metrics DeviceMajor:0 DeviceMinor:672 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/d196dca7-f940-4aa0-b20a-214d22b62db6/volumes/kubernetes.io~projected/kube-api-access-tphq2 DeviceMajor:0 DeviceMinor:692 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1175 DeviceMajor:0 DeviceMinor:1175 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-84 DeviceMajor:0 DeviceMinor:84 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/volumes/kubernetes.io~projected/kube-api-access-qdhcd DeviceMajor:0 DeviceMinor:311 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:988 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/volumes/kubernetes.io~projected/kube-api-access-6bhk4 DeviceMajor:0 DeviceMinor:303 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/63e3d36d-1676-4f90-ac9a-d85b861a4655/volumes/kubernetes.io~secret/signing-key DeviceMajor:0 DeviceMinor:495 Capacity:49335554048 Type:vfs 
Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/5decce88-c71e-411c-87b5-a37dd0f77e7b/volumes/kubernetes.io~secret/image-registry-operator-tls DeviceMajor:0 DeviceMinor:575 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/1f82c7a1-ec21-497d-86f2-562cafa7ace7/volumes/kubernetes.io~projected/kube-api-access-95zsj DeviceMajor:0 DeviceMinor:792 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-120 DeviceMajor:0 DeviceMinor:120 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-106 DeviceMajor:0 DeviceMinor:106 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-967 DeviceMajor:0 DeviceMinor:967 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2e5b6b4913ad9a7e9beefb1308e65939d7d65885f92832939f4bd387eda50473/userdata/shm DeviceMajor:0 DeviceMinor:1365 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e6304ea619f0b996e6dede6cd4e07910aa977eac4013d0444808ca8298842f22/userdata/shm DeviceMajor:0 DeviceMinor:327 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9a60557e4a853b254e1a52367430f6552fb59c31039de6af8378df26f94038fb/userdata/shm DeviceMajor:0 DeviceMinor:685 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1001 DeviceMajor:0 DeviceMinor:1001 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f6cfc0641f7e192cbb940115d2ba3add0762b14146ea756523e733a04332e0a9/userdata/shm DeviceMajor:0 DeviceMinor:337 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/09f5df5c-fd9b-430d-aecc-242594b4aff1/volumes/kubernetes.io~projected/kube-api-access-twlw5 DeviceMajor:0 DeviceMinor:960 Capacity:49335554048 Type:vfs Inodes:6166278 
HasInodes:true} {Device:/run/containers/storage/overlay-containers/ec90b46e5817f62e5cb3d92e8419aeaaa1a2c0a9eebd84f2c7545dcfdabcf365/userdata/shm DeviceMajor:0 DeviceMinor:331 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1176 DeviceMajor:0 DeviceMinor:1176 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1080 DeviceMajor:0 DeviceMinor:1080 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-927 DeviceMajor:0 DeviceMinor:927 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d4b28ab1c5b84f1f69a50cfae68a166f61b8e5091b37338c90666da83b930b13/userdata/shm DeviceMajor:0 DeviceMinor:683 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/b2021db5-b27a-4e06-beec-d9ba82aa1ffc/volumes/kubernetes.io~projected/kube-api-access-j6skg DeviceMajor:0 DeviceMinor:970 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1992b1130615c3114c9b58cd6decbf77558f0295aafbe17982440031c3ee9788/userdata/shm DeviceMajor:0 DeviceMinor:954 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/d210062f-c07e-419f-a551-c37571565686/volumes/kubernetes.io~projected/kube-api-access-v7xk9 DeviceMajor:0 DeviceMinor:162 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a19b8f9e-6299-43bf-9aa5-22071b855773/volumes/kubernetes.io~projected/kube-api-access-6ghnf DeviceMajor:0 DeviceMinor:310 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-350 DeviceMajor:0 DeviceMinor:350 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-411 DeviceMajor:0 DeviceMinor:411 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:/var/lib/kubelet/pods/46b5d4d0-b841-4e87-84b4-85911ff04325/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:675 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-949 DeviceMajor:0 DeviceMinor:949 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1242 DeviceMajor:0 DeviceMinor:1242 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-389 DeviceMajor:0 DeviceMinor:389 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1378 DeviceMajor:0 DeviceMinor:1378 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b7caf673c76ae18dcf4a0dfc42dc02071d5031c44976dc3b0bf55ef4e26083bf/userdata/shm DeviceMajor:0 DeviceMinor:587 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-802 DeviceMajor:0 DeviceMinor:802 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c4d45235-fb1a-4626-a41e-b1e34f7bf76e/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:186 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/2f618ea7-3ad7-4dce-b450-a8202285f312/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:169 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/c593a75e-c2af-4419-94da-e0c9ff14c41f/volumes/kubernetes.io~projected/kube-api-access-j2xcx DeviceMajor:0 DeviceMinor:666 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-525 DeviceMajor:0 DeviceMinor:525 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1265 DeviceMajor:0 DeviceMinor:1265 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7ed25861-1328-45e7-922e-37588a0b019c/volumes/kubernetes.io~secret/node-tuning-operator-tls 
DeviceMajor:0 DeviceMinor:442 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/6eb4700c-6af0-468b-afc8-1e09b902d6bf/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:77 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6a5af31c4c1e2f84958d04c9531001f07d3ef520fdf16d375a2d25f61196cfa7/userdata/shm DeviceMajor:0 DeviceMinor:694 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-698 DeviceMajor:0 DeviceMinor:698 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-948 DeviceMajor:0 DeviceMinor:948 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b673cb04-f6f0-4113-bdcd-d6685b942c9f/volumes/kubernetes.io~projected/kube-api-access-m2qch DeviceMajor:0 DeviceMinor:312 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/85f7ddcd30f09f1a0fda67d2dbaf1344d49e468b4e45601d31e0dfb9ac188ad5/userdata/shm DeviceMajor:0 DeviceMinor:1046 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7bfd19dcf77f81a2da47b10628f23027c2e3ee7dbe77cc6ea6e50ab79c6df0a9/userdata/shm DeviceMajor:0 DeviceMinor:81 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1335 DeviceMajor:0 DeviceMinor:1335 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5bee7d8031a36fa09960f186184717b2ac09e44e86995d183c886a9ab1dcdca8/userdata/shm DeviceMajor:0 DeviceMinor:977 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6/volumes/kubernetes.io~projected/kube-api-access-bztz2 DeviceMajor:0 DeviceMinor:118 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-183 DeviceMajor:0 DeviceMinor:183 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ba68608f-6b36-455e-b80b-d19237df9312/volumes/kubernetes.io~projected/kube-api-access-855t4 DeviceMajor:0 DeviceMinor:381 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e59b58099dad5ea5ed0fd3c1716f8fb9f04f32f368cb6e0afc9cede661e06a70/userdata/shm DeviceMajor:0 DeviceMinor:586 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-151 DeviceMajor:0 DeviceMinor:151 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-360 DeviceMajor:0 DeviceMinor:360 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/06b5799dce9659bf0df409ce3b2524ff568aaba7fc6e7ca8b83098be9071ffc9/userdata/shm DeviceMajor:0 DeviceMinor:728 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/f749c7f2-1fd7-4078-a92d-0ae5523998ac/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:971 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b4885c85229f1632ce115036d60c7a6767b9efe2b85e96fadba3614a99fdc575/userdata/shm DeviceMajor:0 DeviceMinor:326 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-478 DeviceMajor:0 DeviceMinor:478 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-493 DeviceMajor:0 DeviceMinor:493 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-518 DeviceMajor:0 DeviceMinor:518 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-772 DeviceMajor:0 DeviceMinor:772 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1103 DeviceMajor:0 
DeviceMinor:1103 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1400 DeviceMajor:0 DeviceMinor:1400 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-738 DeviceMajor:0 DeviceMinor:738 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4a1b2034d20b8550395063b65a0de0eddb16cb0c3a6fde052b4127e400052376/userdata/shm DeviceMajor:0 DeviceMinor:1006 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/433c3273-c99e-4d68-befc-06f92d2fc8d5/volumes/kubernetes.io~projected/kube-api-access-xwcj7 DeviceMajor:0 DeviceMinor:969 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/b8709c6c-8729-4702-a3fb-35a072855096/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:879 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/75f2d2ce983b4d5090010050d78ba28c8452643f80661c230a1cbdc90a216214/userdata/shm DeviceMajor:0 DeviceMinor:187 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/f96c70ce-314a-4919-91e9-cc776a620846/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:793 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/6404bbc7-8ca9-4f20-8ce7-40f855555160/volumes/kubernetes.io~projected/kube-api-access-4d468 DeviceMajor:0 DeviceMinor:964 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f9f99422-7991-40ef-92a1-de2e603e47b9/volumes/kubernetes.io~projected/kube-api-access-pk4z4 DeviceMajor:0 DeviceMinor:330 Capacity:49335554048 Type:vfs 
Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/f96c70ce-314a-4919-91e9-cc776a620846/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:794 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-624 DeviceMajor:0 DeviceMinor:624 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-496 DeviceMajor:0 DeviceMinor:496 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1328 DeviceMajor:0 DeviceMinor:1328 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-554 DeviceMajor:0 DeviceMinor:554 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1297 DeviceMajor:0 DeviceMinor:1297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1f82c7a1-ec21-497d-86f2-562cafa7ace7/volumes/kubernetes.io~secret/catalogserver-certs DeviceMajor:0 DeviceMinor:790 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/830d89af-1266-43ac-b113-990a28595f91/volumes/kubernetes.io~projected/kube-api-access-lkhcw DeviceMajor:0 DeviceMinor:386 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-852 DeviceMajor:0 DeviceMinor:852 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5b3ee9a2-0f17-4a04-9191-b60684ef6c29/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:313 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1cf57a007c4e3680497adee52392a99d33552f24788c0574cbafbc31f9dc73f4/userdata/shm DeviceMajor:0 DeviceMinor:1010 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-471 DeviceMajor:0 DeviceMinor:471 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-197 DeviceMajor:0 DeviceMinor:197 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:overlay_0-734 DeviceMajor:0 DeviceMinor:734 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-894 DeviceMajor:0 DeviceMinor:894 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-425 DeviceMajor:0 DeviceMinor:425 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:06784de650baea1 MacAddress:de:7e:66:6e:f4:b7 Speed:10000 Mtu:8900} {Name:06b5799dce9659b MacAddress:66:0b:13:8f:02:b2 Speed:10000 Mtu:8900} {Name:08c01ca5f1fe5f2 MacAddress:da:fa:76:4c:74:41 Speed:10000 Mtu:8900} {Name:0b9c573f7ba19dc MacAddress:16:d4:8e:a6:fe:22 Speed:10000 Mtu:8900} {Name:0c22de28b514bd9 MacAddress:4e:14:52:68:30:38 Speed:10000 Mtu:8900} {Name:0c718666bc5fa03 MacAddress:aa:78:a3:32:8e:cd Speed:10000 Mtu:8900} {Name:0d31ad42fdaaa8d MacAddress:72:58:dc:e1:f5:1a Speed:10000 Mtu:8900} {Name:15fa7ece9624e47 MacAddress:c6:2b:13:99:74:29 Speed:10000 Mtu:8900} {Name:1cd08c33a38d123 MacAddress:c6:e0:f0:f0:b3:ae Speed:10000 Mtu:8900} {Name:1cf57a007c4e368 MacAddress:fe:a4:6d:bf:7d:1a Speed:10000 Mtu:8900} {Name:28a18281ec372fa MacAddress:66:72:66:77:1f:a4 Speed:10000 Mtu:8900} {Name:2ae6841c89bd0bc MacAddress:5a:4e:6f:56:ca:11 Speed:10000 Mtu:8900} {Name:2e5b6b4913ad9a7 MacAddress:22:df:2f:eb:40:22 Speed:10000 Mtu:8900} {Name:38072447ae41285 MacAddress:92:a5:a1:be:a6:9f Speed:10000 Mtu:8900} {Name:3cff086346a7c5e MacAddress:86:c5:a8:67:7e:0b Speed:10000 Mtu:8900} {Name:49efa7facfce8d5 MacAddress:3e:bb:1a:68:9e:3e Speed:10000 Mtu:8900} {Name:4a1b2034d20b855 MacAddress:e6:89:84:ad:f2:2c Speed:10000 Mtu:8900} 
{Name:52e10ffcc1fbdf8 MacAddress:ee:da:53:3f:ce:9b Speed:10000 Mtu:8900} {Name:5bee7d8031a36fa MacAddress:56:6e:a5:96:f3:0b Speed:10000 Mtu:8900} {Name:7311eb8e0cfeb88 MacAddress:72:1f:22:eb:2a:e9 Speed:10000 Mtu:8900} {Name:75ad2809d96a136 MacAddress:c6:a8:3b:e2:e8:00 Speed:10000 Mtu:8900} {Name:75ad395a74500e6 MacAddress:8e:4f:85:9b:81:aa Speed:10000 Mtu:8900} {Name:85f7ddcd30f09f1 MacAddress:7e:62:50:65:84:46 Speed:10000 Mtu:8900} {Name:988c74f72d6d398 MacAddress:aa:83:55:b9:ed:6f Speed:10000 Mtu:8900} {Name:9a60557e4a853b2 MacAddress:62:2a:bd:86:1e:3d Speed:10000 Mtu:8900} {Name:9ae1ee41d37f4b0 MacAddress:36:a6:86:5d:d9:75 Speed:10000 Mtu:8900} {Name:a336a885dee0216 MacAddress:d6:3f:31:49:e0:b3 Speed:10000 Mtu:8900} {Name:a57372b0142961f MacAddress:7a:c8:5b:4a:b5:85 Speed:10000 Mtu:8900} {Name:ae7d38a01611431 MacAddress:06:9a:23:0e:b7:f7 Speed:10000 Mtu:8900} {Name:b0f89725c2a6c35 MacAddress:0a:d5:9d:f0:f2:ce Speed:10000 Mtu:8900} {Name:b4885c85229f163 MacAddress:4e:9e:9a:21:1e:13 Speed:10000 Mtu:8900} {Name:b4d8dcd686b7f43 MacAddress:02:88:5c:1a:60:28 Speed:10000 Mtu:8900} {Name:b7caf673c76ae18 MacAddress:4a:0b:c8:b1:fd:95 Speed:10000 Mtu:8900} {Name:b9062d56a3074fc MacAddress:a2:03:7d:07:d7:6a Speed:10000 Mtu:8900} {Name:bb807fb004e1c5a MacAddress:c6:ef:0d:92:af:d5 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:2a:36:a0:22:39:c7 Speed:0 Mtu:8900} {Name:c0a70d9d0d86d6f MacAddress:16:81:96:1e:e9:d6 Speed:10000 Mtu:8900} {Name:c8fa62db9ae1d5a MacAddress:fe:0b:61:d6:83:1d Speed:10000 Mtu:8900} {Name:d4b28ab1c5b84f1 MacAddress:0a:a7:6e:60:3d:60 Speed:10000 Mtu:8900} {Name:d6be581fee143ab MacAddress:8a:9b:07:d2:b7:88 Speed:10000 Mtu:8900} {Name:db6143edbd1b68c MacAddress:1e:0e:41:c4:03:10 Speed:10000 Mtu:8900} {Name:dbc9d9f3c90ebc5 MacAddress:12:dd:4c:18:ea:88 Speed:10000 Mtu:8900} {Name:dbe65295e2c898b MacAddress:2e:79:d6:e4:f5:5b Speed:10000 Mtu:8900} {Name:e2387dbfcc1d429 MacAddress:e2:47:67:e8:d3:cd 
Speed:10000 Mtu:8900} {Name:e59b58099dad5ea MacAddress:7e:df:31:e6:27:e3 Speed:10000 Mtu:8900} {Name:e6304ea619f0b99 MacAddress:2e:85:23:56:cd:15 Speed:10000 Mtu:8900} {Name:ec90b46e5817f62 MacAddress:2a:dd:4f:70:1d:d2 Speed:10000 Mtu:8900} {Name:ef04edbf9389316 MacAddress:ca:72:6e:0e:e0:f4 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:c1:91:ba Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:45:dc:6d Speed:-1 Mtu:9000} {Name:f6cfc0641f7e192 MacAddress:fe:58:40:49:74:41 Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:1a:18:db:b8:db:2e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514153472 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 
Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 
Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 03 20:09:25.351417 master-0 kubenswrapper[29252]: I1203 20:09:25.350206 29252 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 03 20:09:25.351417 master-0 kubenswrapper[29252]: I1203 20:09:25.350281 29252 manager.go:233] Version: {KernelVersion:5.14.0-427.97.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202511041748-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 03 20:09:25.351417 master-0 kubenswrapper[29252]: I1203 20:09:25.351351 29252 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 03 20:09:25.351983 master-0 kubenswrapper[29252]: I1203 20:09:25.351562 29252 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 03 20:09:25.351983 master-0 kubenswrapper[29252]: I1203 20:09:25.351614 29252 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 03 20:09:25.351983 master-0 kubenswrapper[29252]: I1203 20:09:25.351924 29252 topology_manager.go:138] "Creating topology manager with none policy" Dec 03 20:09:25.351983 master-0 kubenswrapper[29252]: I1203 20:09:25.351935 29252 container_manager_linux.go:303] "Creating device plugin manager" Dec 03 20:09:25.351983 master-0 kubenswrapper[29252]: I1203 20:09:25.351944 29252 
manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 03 20:09:25.351983 master-0 kubenswrapper[29252]: I1203 20:09:25.351965 29252 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 03 20:09:25.352161 master-0 kubenswrapper[29252]: I1203 20:09:25.352012 29252 state_mem.go:36] "Initialized new in-memory state store" Dec 03 20:09:25.352161 master-0 kubenswrapper[29252]: I1203 20:09:25.352092 29252 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 03 20:09:25.352215 master-0 kubenswrapper[29252]: I1203 20:09:25.352168 29252 kubelet.go:418] "Attempting to sync node with API server" Dec 03 20:09:25.352215 master-0 kubenswrapper[29252]: I1203 20:09:25.352180 29252 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 03 20:09:25.352215 master-0 kubenswrapper[29252]: I1203 20:09:25.352195 29252 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 03 20:09:25.352215 master-0 kubenswrapper[29252]: I1203 20:09:25.352210 29252 kubelet.go:324] "Adding apiserver pod source" Dec 03 20:09:25.352314 master-0 kubenswrapper[29252]: I1203 20:09:25.352224 29252 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 03 20:09:25.355336 master-0 kubenswrapper[29252]: I1203 20:09:25.355278 29252 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-2.rhaos4.18.git15789b8.el9" apiVersion="v1" Dec 03 20:09:25.355522 master-0 kubenswrapper[29252]: I1203 20:09:25.355490 29252 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Dec 03 20:09:25.356750 master-0 kubenswrapper[29252]: I1203 20:09:25.356294 29252 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 03 20:09:25.356750 master-0 kubenswrapper[29252]: I1203 20:09:25.356490 29252 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 03 20:09:25.356750 master-0 kubenswrapper[29252]: I1203 20:09:25.356508 29252 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 03 20:09:25.356750 master-0 kubenswrapper[29252]: I1203 20:09:25.356515 29252 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 03 20:09:25.356750 master-0 kubenswrapper[29252]: I1203 20:09:25.356521 29252 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 03 20:09:25.356750 master-0 kubenswrapper[29252]: I1203 20:09:25.356527 29252 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 03 20:09:25.356750 master-0 kubenswrapper[29252]: I1203 20:09:25.356533 29252 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 03 20:09:25.356750 master-0 kubenswrapper[29252]: I1203 20:09:25.356556 29252 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 03 20:09:25.356750 master-0 kubenswrapper[29252]: I1203 20:09:25.356576 29252 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 03 20:09:25.356750 master-0 kubenswrapper[29252]: I1203 20:09:25.356584 29252 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 03 20:09:25.356750 master-0 kubenswrapper[29252]: I1203 20:09:25.356592 29252 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 03 20:09:25.356750 master-0 kubenswrapper[29252]: I1203 20:09:25.356614 29252 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 03 20:09:25.356750 master-0 kubenswrapper[29252]: I1203 20:09:25.356626 29252 
plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 03 20:09:25.356750 master-0 kubenswrapper[29252]: I1203 20:09:25.356647 29252 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 03 20:09:25.358294 master-0 kubenswrapper[29252]: I1203 20:09:25.358266 29252 server.go:1280] "Started kubelet" Dec 03 20:09:25.358426 master-0 kubenswrapper[29252]: I1203 20:09:25.358368 29252 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 03 20:09:25.359722 master-0 systemd[1]: Started Kubernetes Kubelet. Dec 03 20:09:25.369399 master-0 kubenswrapper[29252]: I1203 20:09:25.358424 29252 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 03 20:09:25.369399 master-0 kubenswrapper[29252]: I1203 20:09:25.360717 29252 server_v1.go:47] "podresources" method="list" useActivePods=true Dec 03 20:09:25.369399 master-0 kubenswrapper[29252]: I1203 20:09:25.361130 29252 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 03 20:09:25.374102 master-0 kubenswrapper[29252]: I1203 20:09:25.372294 29252 server.go:449] "Adding debug handlers to kubelet server" Dec 03 20:09:25.379496 master-0 kubenswrapper[29252]: I1203 20:09:25.379269 29252 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 03 20:09:25.379496 master-0 kubenswrapper[29252]: I1203 20:09:25.379314 29252 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 03 20:09:25.379496 master-0 kubenswrapper[29252]: I1203 20:09:25.379367 29252 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 03 20:09:25.379496 master-0 kubenswrapper[29252]: I1203 20:09:25.379383 29252 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 03 20:09:25.379496 master-0 kubenswrapper[29252]: I1203 20:09:25.379386 29252 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration 
is 2025-12-04 19:44:55 +0000 UTC, rotation deadline is 2025-12-04 14:59:30.067765504 +0000 UTC Dec 03 20:09:25.379496 master-0 kubenswrapper[29252]: I1203 20:09:25.379455 29252 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h50m4.688313762s for next certificate rotation Dec 03 20:09:25.379496 master-0 kubenswrapper[29252]: E1203 20:09:25.379430 29252 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 20:09:25.379496 master-0 kubenswrapper[29252]: I1203 20:09:25.379456 29252 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Dec 03 20:09:25.381243 master-0 kubenswrapper[29252]: I1203 20:09:25.381127 29252 factory.go:55] Registering systemd factory Dec 03 20:09:25.381243 master-0 kubenswrapper[29252]: I1203 20:09:25.381178 29252 factory.go:221] Registration of the systemd container factory successfully Dec 03 20:09:25.381601 master-0 kubenswrapper[29252]: I1203 20:09:25.381578 29252 factory.go:153] Registering CRI-O factory Dec 03 20:09:25.381601 master-0 kubenswrapper[29252]: I1203 20:09:25.381594 29252 factory.go:221] Registration of the crio container factory successfully Dec 03 20:09:25.381685 master-0 kubenswrapper[29252]: I1203 20:09:25.381666 29252 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 03 20:09:25.381721 master-0 kubenswrapper[29252]: I1203 20:09:25.381690 29252 factory.go:103] Registering Raw factory Dec 03 20:09:25.381721 master-0 kubenswrapper[29252]: I1203 20:09:25.381705 29252 manager.go:1196] Started watching for new ooms in manager Dec 03 20:09:25.382339 master-0 kubenswrapper[29252]: I1203 20:09:25.382092 29252 manager.go:319] Starting recovery of all containers Dec 03 20:09:25.382749 master-0 kubenswrapper[29252]: E1203 20:09:25.382706 
29252 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390104 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2021db5-b27a-4e06-beec-d9ba82aa1ffc" volumeName="kubernetes.io/secret/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-cert" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390180 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c593a75e-c2af-4419-94da-e0c9ff14c41f" volumeName="kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-image-import-ca" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390192 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c593a75e-c2af-4419-94da-e0c9ff14c41f" volumeName="kubernetes.io/projected/c593a75e-c2af-4419-94da-e0c9ff14c41f-kube-api-access-j2xcx" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390207 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d5f33153-bff1-403f-ae17-b7e90500365d" volumeName="kubernetes.io/projected/d5f33153-bff1-403f-ae17-b7e90500365d-kube-api-access-5sdw4" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390218 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1c22cb59-5083-4be6-9998-a9e67a2c20cd" volumeName="kubernetes.io/secret/1c22cb59-5083-4be6-9998-a9e67a2c20cd-serving-cert" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390229 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="7ed25861-1328-45e7-922e-37588a0b019c" volumeName="kubernetes.io/configmap/7ed25861-1328-45e7-922e-37588a0b019c-trusted-ca" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390244 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" volumeName="kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-config" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390255 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ad22d8ed-2476-441b-aa3b-a7845606b0ac" volumeName="kubernetes.io/configmap/ad22d8ed-2476-441b-aa3b-a7845606b0ac-config" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390272 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af2023e1-9c7a-40af-a6bf-fba31c3565b1" volumeName="kubernetes.io/projected/af2023e1-9c7a-40af-a6bf-fba31c3565b1-kube-api-access-hdd6z" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390283 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c593a75e-c2af-4419-94da-e0c9ff14c41f" volumeName="kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-serving-cert" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390294 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf" volumeName="kubernetes.io/configmap/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-trusted-ca" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390308 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="433c3273-c99e-4d68-befc-06f92d2fc8d5" volumeName="kubernetes.io/projected/433c3273-c99e-4d68-befc-06f92d2fc8d5-kube-api-access-xwcj7" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390319 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="63e3d36d-1676-4f90-ac9a-d85b861a4655" volumeName="kubernetes.io/projected/63e3d36d-1676-4f90-ac9a-d85b861a4655-kube-api-access-x66sr" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390335 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a19b8f9e-6299-43bf-9aa5-22071b855773" volumeName="kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390346 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba68608f-6b36-455e-b80b-d19237df9312" volumeName="kubernetes.io/configmap/ba68608f-6b36-455e-b80b-d19237df9312-telemetry-config" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390361 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd35fc5f-07ab-4c66-9b80-33a598d417ef" volumeName="kubernetes.io/secret/cd35fc5f-07ab-4c66-9b80-33a598d417ef-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390382 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78a864f2-934f-4197-9753-24c9bc7f1fca" volumeName="kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-ca" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390397 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="78a864f2-934f-4197-9753-24c9bc7f1fca" volumeName="kubernetes.io/secret/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-client" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390408 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" volumeName="kubernetes.io/configmap/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-mcd-auth-proxy-config" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390419 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b673cb04-f6f0-4113-bdcd-d6685b942c9f" volumeName="kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390433 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ed25861-1328-45e7-922e-37588a0b019c" volumeName="kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390443 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="90610a53-b590-491e-8014-f0704afdc6e1" volumeName="kubernetes.io/configmap/90610a53-b590-491e-8014-f0704afdc6e1-auth-proxy-config" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390458 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="90610a53-b590-491e-8014-f0704afdc6e1" volumeName="kubernetes.io/configmap/90610a53-b590-491e-8014-f0704afdc6e1-images" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390468 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="ad22d8ed-2476-441b-aa3b-a7845606b0ac" volumeName="kubernetes.io/secret/ad22d8ed-2476-441b-aa3b-a7845606b0ac-machine-api-operator-tls" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390483 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" volumeName="kubernetes.io/empty-dir/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-available-featuregates" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390519 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5decce88-c71e-411c-87b5-a37dd0f77e7b" volumeName="kubernetes.io/projected/5decce88-c71e-411c-87b5-a37dd0f77e7b-kube-api-access-mr8x9" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390534 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78a864f2-934f-4197-9753-24c9bc7f1fca" volumeName="kubernetes.io/projected/78a864f2-934f-4197-9753-24c9bc7f1fca-kube-api-access-59d2r" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390560 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ed25861-1328-45e7-922e-37588a0b019c" volumeName="kubernetes.io/projected/7ed25861-1328-45e7-922e-37588a0b019c-kube-api-access-cv24n" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390578 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c4d45235-fb1a-4626-a41e-b1e34f7bf76e" volumeName="kubernetes.io/projected/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-kube-api-access-qhg82" seLinuxMountContext="" Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390588 29252 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="d5f33153-bff1-403f-ae17-b7e90500365d" volumeName="kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-profile-collector-cert" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390599 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f749c7f2-1fd7-4078-a92d-0ae5523998ac" volumeName="kubernetes.io/projected/f749c7f2-1fd7-4078-a92d-0ae5523998ac-kube-api-access-lvklf" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390611 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b3ee9a2-0f17-4a04-9191-b60684ef6c29" volumeName="kubernetes.io/configmap/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-config" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390621 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="73b7027e-44f5-4c7b-9226-585a90530535" volumeName="kubernetes.io/empty-dir/73b7027e-44f5-4c7b-9226-585a90530535-cache" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390634 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ed25861-1328-45e7-922e-37588a0b019c" volumeName="kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390644 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d210062f-c07e-419f-a551-c37571565686" volumeName="kubernetes.io/projected/d210062f-c07e-419f-a551-c37571565686-kube-api-access-v7xk9" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390654 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f96c70ce-314a-4919-91e9-cc776a620846" volumeName="kubernetes.io/configmap/f96c70ce-314a-4919-91e9-cc776a620846-trusted-ca-bundle" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390666 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9f99422-7991-40ef-92a1-de2e603e47b9" volumeName="kubernetes.io/projected/f9f99422-7991-40ef-92a1-de2e603e47b9-kube-api-access-pk4z4" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390676 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="11e2c94f-f9e9-415b-a550-3006a4632ba4" volumeName="kubernetes.io/projected/11e2c94f-f9e9-415b-a550-3006a4632ba4-kube-api-access-pfqnq" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390689 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1c22cb59-5083-4be6-9998-a9e67a2c20cd" volumeName="kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-proxy-ca-bundles" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390699 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b3ee9a2-0f17-4a04-9191-b60684ef6c29" volumeName="kubernetes.io/projected/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-kube-api-access" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390709 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c593a75e-c2af-4419-94da-e0c9ff14c41f" volumeName="kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-trusted-ca-bundle" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390722 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6bb19329-c50c-4214-94c8-7e8771b99233" volumeName="kubernetes.io/empty-dir/6bb19329-c50c-4214-94c8-7e8771b99233-catalog-content" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390734 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c4d45235-fb1a-4626-a41e-b1e34f7bf76e" volumeName="kubernetes.io/configmap/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-ovnkube-identity-cm" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390755 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f" volumeName="kubernetes.io/secret/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-serving-cert" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390765 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f" volumeName="kubernetes.io/projected/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-kube-api-access-457ln" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390790 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7171597-cb9a-451c-80a4-64cfccf885f0" volumeName="kubernetes.io/projected/d7171597-cb9a-451c-80a4-64cfccf885f0-kube-api-access-gs8fx" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390804 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="daa8efc0-4514-4a14-80f5-ab9eca53a127" volumeName="kubernetes.io/configmap/daa8efc0-4514-4a14-80f5-ab9eca53a127-config" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390814 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3" volumeName="kubernetes.io/secret/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-serving-cert" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390824 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1f82c7a1-ec21-497d-86f2-562cafa7ace7" volumeName="kubernetes.io/projected/1f82c7a1-ec21-497d-86f2-562cafa7ace7-kube-api-access-95zsj" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390842 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="367c2c7c-1fc8-4608-aa94-b64c6c70cc61" volumeName="kubernetes.io/projected/367c2c7c-1fc8-4608-aa94-b64c6c70cc61-kube-api-access-hb5j7" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390852 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a82ff78-4383-4ca8-8a72-98c2ee50ffe2" volumeName="kubernetes.io/secret/6a82ff78-4383-4ca8-8a72-98c2ee50ffe2-samples-operator-tls" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390863 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af2023e1-9c7a-40af-a6bf-fba31c3565b1" volumeName="kubernetes.io/secret/af2023e1-9c7a-40af-a6bf-fba31c3565b1-serving-cert" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390878 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f96c70ce-314a-4919-91e9-cc776a620846" volumeName="kubernetes.io/secret/f96c70ce-314a-4919-91e9-cc776a620846-etcd-client" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390893 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8709c6c-8729-4702-a3fb-35a072855096" volumeName="kubernetes.io/configmap/b8709c6c-8729-4702-a3fb-35a072855096-service-ca" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390908 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba68608f-6b36-455e-b80b-d19237df9312" volumeName="kubernetes.io/projected/ba68608f-6b36-455e-b80b-d19237df9312-kube-api-access-855t4" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390923 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c593a75e-c2af-4419-94da-e0c9ff14c41f" volumeName="kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-audit" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390939 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f618ea7-3ad7-4dce-b450-a8202285f312" volumeName="kubernetes.io/secret/2f618ea7-3ad7-4dce-b450-a8202285f312-ovn-node-metrics-cert" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390953 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6eb4700c-6af0-468b-afc8-1e09b902d6bf" volumeName="kubernetes.io/projected/6eb4700c-6af0-468b-afc8-1e09b902d6bf-kube-api-access-w7nkb" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390967 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78a864f2-934f-4197-9753-24c9bc7f1fca" volumeName="kubernetes.io/secret/78a864f2-934f-4197-9753-24c9bc7f1fca-serving-cert" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390979 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="90610a53-b590-491e-8014-f0704afdc6e1" volumeName="kubernetes.io/projected/90610a53-b590-491e-8014-f0704afdc6e1-kube-api-access-4wcmd" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.390992 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c593a75e-c2af-4419-94da-e0c9ff14c41f" volumeName="kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-config" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.391003 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8dbbb6f8-711c-49a0-bc36-fa5d50124bd8" volumeName="kubernetes.io/configmap/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-auth-proxy-config" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.391014 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8dbbb6f8-711c-49a0-bc36-fa5d50124bd8" volumeName="kubernetes.io/projected/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-kube-api-access-qzd2g" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.391028 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" volumeName="kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-service-ca-bundle" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.391038 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c4d45235-fb1a-4626-a41e-b1e34f7bf76e" volumeName="kubernetes.io/configmap/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-env-overrides" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.391052 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01d51d9a-9beb-4357-9dc2-aeac210cd0c4" volumeName="kubernetes.io/configmap/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-config" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.391063 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c3afc439-ccaa-4751-95a1-ac7557e326f0" volumeName="kubernetes.io/projected/c3afc439-ccaa-4751-95a1-ac7557e326f0-kube-api-access-ljsr6" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.391075 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3" volumeName="kubernetes.io/configmap/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-config" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.391088 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" volumeName="kubernetes.io/projected/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-kube-api-access-c5dpx" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.391099 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2" volumeName="kubernetes.io/empty-dir/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2-catalog-content" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.391109 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="daa8efc0-4514-4a14-80f5-ab9eca53a127" volumeName="kubernetes.io/projected/daa8efc0-4514-4a14-80f5-ab9eca53a127-kube-api-access-rbsx8" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.391131 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1f82c7a1-ec21-497d-86f2-562cafa7ace7" volumeName="kubernetes.io/empty-dir/1f82c7a1-ec21-497d-86f2-562cafa7ace7-cache" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.391140 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="433c3273-c99e-4d68-befc-06f92d2fc8d5" volumeName="kubernetes.io/secret/433c3273-c99e-4d68-befc-06f92d2fc8d5-cluster-baremetal-operator-tls" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.391153 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="63e3d36d-1676-4f90-ac9a-d85b861a4655" volumeName="kubernetes.io/configmap/63e3d36d-1676-4f90-ac9a-d85b861a4655-signing-cabundle" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.391162 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="73b7027e-44f5-4c7b-9226-585a90530535" volumeName="kubernetes.io/projected/73b7027e-44f5-4c7b-9226-585a90530535-ca-certs" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.391173 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09f5df5c-fd9b-430d-aecc-242594b4aff1" volumeName="kubernetes.io/projected/09f5df5c-fd9b-430d-aecc-242594b4aff1-kube-api-access-twlw5" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.391191 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" volumeName="kubernetes.io/projected/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-kube-api-access-qdhcd" seLinuxMountContext=""
Dec 03 20:09:25.391352 master-0 kubenswrapper[29252]: I1203 20:09:25.391201 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87f1759a-7df4-442e-a22d-6de8d54be333" volumeName="kubernetes.io/projected/87f1759a-7df4-442e-a22d-6de8d54be333-kube-api-access-wvllg" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391471 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2" volumeName="kubernetes.io/empty-dir/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2-utilities" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391557 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a19b8f9e-6299-43bf-9aa5-22071b855773" volumeName="kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-profile-collector-cert" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391568 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a710102c-72fb-4d8d-ad99-71940368a09e" volumeName="kubernetes.io/projected/a710102c-72fb-4d8d-ad99-71940368a09e-kube-api-access-zgmkc" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391585 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2021db5-b27a-4e06-beec-d9ba82aa1ffc" volumeName="kubernetes.io/configmap/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-auth-proxy-config" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391596 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d196dca7-f940-4aa0-b20a-214d22b62db6" volumeName="kubernetes.io/configmap/d196dca7-f940-4aa0-b20a-214d22b62db6-config-volume" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391608 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09f5df5c-fd9b-430d-aecc-242594b4aff1" volumeName="kubernetes.io/secret/09f5df5c-fd9b-430d-aecc-242594b4aff1-machine-approver-tls" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391620 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="11e2c94f-f9e9-415b-a550-3006a4632ba4" volumeName="kubernetes.io/secret/11e2c94f-f9e9-415b-a550-3006a4632ba4-serving-cert" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391630 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="73b7027e-44f5-4c7b-9226-585a90530535" volumeName="kubernetes.io/projected/73b7027e-44f5-4c7b-9226-585a90530535-kube-api-access-7pf5q" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391642 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" volumeName="kubernetes.io/secret/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-proxy-tls" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391652 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d210062f-c07e-419f-a551-c37571565686" volumeName="kubernetes.io/configmap/d210062f-c07e-419f-a551-c37571565686-env-overrides" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391668 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7f613c6-77d6-4cf9-afa0-7c494dee2a8e" volumeName="kubernetes.io/empty-dir/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-tmpfs" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391679 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8dbbb6f8-711c-49a0-bc36-fa5d50124bd8" volumeName="kubernetes.io/configmap/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-images" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391696 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8709c6c-8729-4702-a3fb-35a072855096" volumeName="kubernetes.io/projected/b8709c6c-8729-4702-a3fb-35a072855096-kube-api-access" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391709 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba68608f-6b36-455e-b80b-d19237df9312" volumeName="kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391719 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d196dca7-f940-4aa0-b20a-214d22b62db6" volumeName="kubernetes.io/secret/d196dca7-f940-4aa0-b20a-214d22b62db6-metrics-tls" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391732 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46b5d4d0-b841-4e87-84b4-85911ff04325" volumeName="kubernetes.io/projected/46b5d4d0-b841-4e87-84b4-85911ff04325-kube-api-access-s2c85" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391743 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5decce88-c71e-411c-87b5-a37dd0f77e7b" volumeName="kubernetes.io/projected/5decce88-c71e-411c-87b5-a37dd0f77e7b-bound-sa-token" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391752 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78a864f2-934f-4197-9753-24c9bc7f1fca" volumeName="kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-config" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391764 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87f1759a-7df4-442e-a22d-6de8d54be333" volumeName="kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-whereabouts-configmap" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391807 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" volumeName="kubernetes.io/secret/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-serving-cert" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391824 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd35fc5f-07ab-4c66-9b80-33a598d417ef" volumeName="kubernetes.io/projected/cd35fc5f-07ab-4c66-9b80-33a598d417ef-kube-api-access-qk5wb" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391835 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2" volumeName="kubernetes.io/projected/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2-kube-api-access-7bdn5" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391844 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d210062f-c07e-419f-a551-c37571565686" volumeName="kubernetes.io/secret/d210062f-c07e-419f-a551-c37571565686-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391857 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="11e2c94f-f9e9-415b-a550-3006a4632ba4" volumeName="kubernetes.io/configmap/11e2c94f-f9e9-415b-a550-3006a4632ba4-config" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391868 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf" volumeName="kubernetes.io/projected/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-bound-sa-token" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391890 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6404bbc7-8ca9-4f20-8ce7-40f855555160" volumeName="kubernetes.io/projected/6404bbc7-8ca9-4f20-8ce7-40f855555160-kube-api-access-4d468" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391907 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a82ff78-4383-4ca8-8a72-98c2ee50ffe2" volumeName="kubernetes.io/projected/6a82ff78-4383-4ca8-8a72-98c2ee50ffe2-kube-api-access-dl5h7" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391922 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1c22cb59-5083-4be6-9998-a9e67a2c20cd" volumeName="kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-config" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391935 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8dbbb6f8-711c-49a0-bc36-fa5d50124bd8" volumeName="kubernetes.io/secret/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-proxy-tls" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391946 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="90610a53-b590-491e-8014-f0704afdc6e1" volumeName="kubernetes.io/secret/90610a53-b590-491e-8014-f0704afdc6e1-cloud-controller-manager-operator-tls" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391961 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b638f207-31df-4298-8801-4da6031deefc" volumeName="kubernetes.io/projected/b638f207-31df-4298-8801-4da6031deefc-kube-api-access-trv6b" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391975 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01d51d9a-9beb-4357-9dc2-aeac210cd0c4" volumeName="kubernetes.io/projected/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-kube-api-access-6sqtm" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.391985 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a19b8f9e-6299-43bf-9aa5-22071b855773" volumeName="kubernetes.io/projected/a19b8f9e-6299-43bf-9aa5-22071b855773-kube-api-access-6ghnf" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392000 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7171597-cb9a-451c-80a4-64cfccf885f0" volumeName="kubernetes.io/empty-dir/d7171597-cb9a-451c-80a4-64cfccf885f0-tmp" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392015 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="943feb0d-7d31-446a-9100-dfc4ef013d12" volumeName="kubernetes.io/projected/943feb0d-7d31-446a-9100-dfc4ef013d12-kube-api-access" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392026 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" volumeName="kubernetes.io/projected/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-kube-api-access-sxqph" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392038 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2021db5-b27a-4e06-beec-d9ba82aa1ffc" volumeName="kubernetes.io/projected/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-kube-api-access-j6skg" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392048 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b673cb04-f6f0-4113-bdcd-d6685b942c9f" volumeName="kubernetes.io/configmap/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-trusted-ca" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392065 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6bb19329-c50c-4214-94c8-7e8771b99233" volumeName="kubernetes.io/empty-dir/6bb19329-c50c-4214-94c8-7e8771b99233-utilities" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392085 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6bb19329-c50c-4214-94c8-7e8771b99233" volumeName="kubernetes.io/projected/6bb19329-c50c-4214-94c8-7e8771b99233-kube-api-access-kszjr" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392095 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="943feb0d-7d31-446a-9100-dfc4ef013d12" volumeName="kubernetes.io/secret/943feb0d-7d31-446a-9100-dfc4ef013d12-serving-cert" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392107 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c593a75e-c2af-4419-94da-e0c9ff14c41f" volumeName="kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-etcd-client" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392117 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" volumeName="kubernetes.io/secret/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-serving-cert" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392129 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1c22cb59-5083-4be6-9998-a9e67a2c20cd" volumeName="kubernetes.io/projected/1c22cb59-5083-4be6-9998-a9e67a2c20cd-kube-api-access-7cnmn" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392143 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f618ea7-3ad7-4dce-b450-a8202285f312" volumeName="kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-env-overrides" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392158 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="433c3273-c99e-4d68-befc-06f92d2fc8d5" volumeName="kubernetes.io/secret/433c3273-c99e-4d68-befc-06f92d2fc8d5-cert" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392169 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f96c70ce-314a-4919-91e9-cc776a620846" volumeName="kubernetes.io/configmap/f96c70ce-314a-4919-91e9-cc776a620846-etcd-serving-ca" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392185 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9f99422-7991-40ef-92a1-de2e603e47b9" volumeName="kubernetes.io/empty-dir/f9f99422-7991-40ef-92a1-de2e603e47b9-operand-assets" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392325 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6" volumeName="kubernetes.io/configmap/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-daemon-config" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392343 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6" volumeName="kubernetes.io/projected/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-kube-api-access-bztz2" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392363 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="daa8efc0-4514-4a14-80f5-ab9eca53a127" volumeName="kubernetes.io/secret/daa8efc0-4514-4a14-80f5-ab9eca53a127-serving-cert" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392394 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2d43df9b-bb29-4581-8cd9-f3b9c0c0e4d9" volumeName="kubernetes.io/projected/2d43df9b-bb29-4581-8cd9-f3b9c0c0e4d9-kube-api-access-grk2s" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392483 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b3ee9a2-0f17-4a04-9191-b60684ef6c29" volumeName="kubernetes.io/secret/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-serving-cert" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392499 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78a864f2-934f-4197-9753-24c9bc7f1fca" volumeName="kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-service-ca" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392511 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87f1759a-7df4-442e-a22d-6de8d54be333" volumeName="kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-cni-binary-copy" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392521 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a710102c-72fb-4d8d-ad99-71940368a09e" volumeName="kubernetes.io/empty-dir/a710102c-72fb-4d8d-ad99-71940368a09e-utilities" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392531 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af2023e1-9c7a-40af-a6bf-fba31c3565b1" volumeName="kubernetes.io/configmap/af2023e1-9c7a-40af-a6bf-fba31c3565b1-trusted-ca-bundle" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392543 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b673cb04-f6f0-4113-bdcd-d6685b942c9f" volumeName="kubernetes.io/projected/b673cb04-f6f0-4113-bdcd-d6685b942c9f-kube-api-access-m2qch" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392553 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="56e013ee-ea7a-4780-8986-a7fd1b5a3a3f" volumeName="kubernetes.io/projected/56e013ee-ea7a-4780-8986-a7fd1b5a3a3f-kube-api-access-vvlxr" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392629 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6eb4700c-6af0-468b-afc8-1e09b902d6bf" volumeName="kubernetes.io/secret/6eb4700c-6af0-468b-afc8-1e09b902d6bf-metrics-tls" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392643 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c52974d8-fbe6-444b-97ae-468482eebac8" volumeName="kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-client-ca" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392676 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c52974d8-fbe6-444b-97ae-468482eebac8" volumeName="kubernetes.io/projected/c52974d8-fbe6-444b-97ae-468482eebac8-kube-api-access-p7vxl" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392766 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09f5df5c-fd9b-430d-aecc-242594b4aff1" volumeName="kubernetes.io/configmap/09f5df5c-fd9b-430d-aecc-242594b4aff1-auth-proxy-config" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392810 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1c22cb59-5083-4be6-9998-a9e67a2c20cd" volumeName="kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-client-ca" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392836 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1f82c7a1-ec21-497d-86f2-562cafa7ace7" volumeName="kubernetes.io/projected/1f82c7a1-ec21-497d-86f2-562cafa7ace7-ca-certs" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.392922 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="433c3273-c99e-4d68-befc-06f92d2fc8d5" volumeName="kubernetes.io/configmap/433c3273-c99e-4d68-befc-06f92d2fc8d5-config" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.393344 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f749c7f2-1fd7-4078-a92d-0ae5523998ac" volumeName="kubernetes.io/secret/f749c7f2-1fd7-4078-a92d-0ae5523998ac-cluster-storage-operator-serving-cert" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.393983 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c52974d8-fbe6-444b-97ae-468482eebac8" volumeName="kubernetes.io/secret/c52974d8-fbe6-444b-97ae-468482eebac8-serving-cert" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394004 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="128ed384-7ab6-41b6-bf45-c8fda917d52f" volumeName="kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394014 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="63e3d36d-1676-4f90-ac9a-d85b861a4655" volumeName="kubernetes.io/secret/63e3d36d-1676-4f90-ac9a-d85b861a4655-signing-key" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394024 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6404bbc7-8ca9-4f20-8ce7-40f855555160" volumeName="kubernetes.io/configmap/6404bbc7-8ca9-4f20-8ce7-40f855555160-cco-trusted-ca" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394041 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ad22d8ed-2476-441b-aa3b-a7845606b0ac" volumeName="kubernetes.io/projected/ad22d8ed-2476-441b-aa3b-a7845606b0ac-kube-api-access-xjn9m" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394051 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af2023e1-9c7a-40af-a6bf-fba31c3565b1" volumeName="kubernetes.io/empty-dir/af2023e1-9c7a-40af-a6bf-fba31c3565b1-snapshots" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394061 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b638f207-31df-4298-8801-4da6031deefc" volumeName="kubernetes.io/empty-dir/b638f207-31df-4298-8801-4da6031deefc-catalog-content" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394071 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d5f33153-bff1-403f-ae17-b7e90500365d" volumeName="kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394082 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3" volumeName="kubernetes.io/projected/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-kube-api-access" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394093 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1f82c7a1-ec21-497d-86f2-562cafa7ace7" volumeName="kubernetes.io/secret/1f82c7a1-ec21-497d-86f2-562cafa7ace7-catalogserver-certs" seLinuxMountContext=""
Dec 03 20:09:25.399396 master-0
kubenswrapper[29252]: I1203 20:09:25.394103 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f618ea7-3ad7-4dce-b450-a8202285f312" volumeName="kubernetes.io/projected/2f618ea7-3ad7-4dce-b450-a8202285f312-kube-api-access-4c9qq" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394113 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf" volumeName="kubernetes.io/projected/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-kube-api-access-6bhk4" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394123 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="943feb0d-7d31-446a-9100-dfc4ef013d12" volumeName="kubernetes.io/configmap/943feb0d-7d31-446a-9100-dfc4ef013d12-config" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394133 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f96c70ce-314a-4919-91e9-cc776a620846" volumeName="kubernetes.io/secret/f96c70ce-314a-4919-91e9-cc776a620846-encryption-config" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394143 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01d51d9a-9beb-4357-9dc2-aeac210cd0c4" volumeName="kubernetes.io/secret/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-serving-cert" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394157 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8709c6c-8729-4702-a3fb-35a072855096" volumeName="kubernetes.io/secret/b8709c6c-8729-4702-a3fb-35a072855096-serving-cert" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 
kubenswrapper[29252]: I1203 20:09:25.394168 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d210062f-c07e-419f-a551-c37571565686" volumeName="kubernetes.io/configmap/d210062f-c07e-419f-a551-c37571565686-ovnkube-config" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394179 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="371917da-b783-4acc-81af-1cfc903269f4" volumeName="kubernetes.io/projected/371917da-b783-4acc-81af-1cfc903269f4-kube-api-access-w4v7k" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394189 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf" volumeName="kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394199 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46b5d4d0-b841-4e87-84b4-85911ff04325" volumeName="kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394208 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b638f207-31df-4298-8801-4da6031deefc" volumeName="kubernetes.io/empty-dir/b638f207-31df-4298-8801-4da6031deefc-utilities" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394217 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09f5df5c-fd9b-430d-aecc-242594b4aff1" volumeName="kubernetes.io/configmap/09f5df5c-fd9b-430d-aecc-242594b4aff1-config" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: 
I1203 20:09:25.394227 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="128ed384-7ab6-41b6-bf45-c8fda917d52f" volumeName="kubernetes.io/projected/128ed384-7ab6-41b6-bf45-c8fda917d52f-kube-api-access-7qrgh" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394236 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f618ea7-3ad7-4dce-b450-a8202285f312" volumeName="kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-ovnkube-config" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394247 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="371917da-b783-4acc-81af-1cfc903269f4" volumeName="kubernetes.io/configmap/371917da-b783-4acc-81af-1cfc903269f4-iptables-alerter-script" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394257 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9f99422-7991-40ef-92a1-de2e603e47b9" volumeName="kubernetes.io/secret/f9f99422-7991-40ef-92a1-de2e603e47b9-cluster-olm-operator-serving-cert" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394267 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6" volumeName="kubernetes.io/configmap/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-cni-binary-copy" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394276 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c593a75e-c2af-4419-94da-e0c9ff14c41f" volumeName="kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-encryption-config" seLinuxMountContext="" Dec 03 20:09:25.399396 
master-0 kubenswrapper[29252]: I1203 20:09:25.394286 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7f613c6-77d6-4cf9-afa0-7c494dee2a8e" volumeName="kubernetes.io/projected/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-kube-api-access-k7t26" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394297 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7f613c6-77d6-4cf9-afa0-7c494dee2a8e" volumeName="kubernetes.io/secret/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-webhook-cert" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394306 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6404bbc7-8ca9-4f20-8ce7-40f855555160" volumeName="kubernetes.io/secret/6404bbc7-8ca9-4f20-8ce7-40f855555160-cloud-credential-operator-serving-cert" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394315 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="830d89af-1266-43ac-b113-990a28595f91" volumeName="kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394325 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87f1759a-7df4-442e-a22d-6de8d54be333" volumeName="kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-cni-sysctl-allowlist" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394335 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a710102c-72fb-4d8d-ad99-71940368a09e" volumeName="kubernetes.io/empty-dir/a710102c-72fb-4d8d-ad99-71940368a09e-catalog-content" 
seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394345 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f96c70ce-314a-4919-91e9-cc776a620846" volumeName="kubernetes.io/secret/f96c70ce-314a-4919-91e9-cc776a620846-serving-cert" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394353 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="433c3273-c99e-4d68-befc-06f92d2fc8d5" volumeName="kubernetes.io/configmap/433c3273-c99e-4d68-befc-06f92d2fc8d5-images" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394362 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5decce88-c71e-411c-87b5-a37dd0f77e7b" volumeName="kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394371 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c4d45235-fb1a-4626-a41e-b1e34f7bf76e" volumeName="kubernetes.io/secret/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-webhook-cert" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394381 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d196dca7-f940-4aa0-b20a-214d22b62db6" volumeName="kubernetes.io/projected/d196dca7-f940-4aa0-b20a-214d22b62db6-kube-api-access-tphq2" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394395 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7171597-cb9a-451c-80a4-64cfccf885f0" volumeName="kubernetes.io/empty-dir/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-tuned" 
seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394404 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7f613c6-77d6-4cf9-afa0-7c494dee2a8e" volumeName="kubernetes.io/secret/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-apiservice-cert" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394414 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f618ea7-3ad7-4dce-b450-a8202285f312" volumeName="kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-ovnkube-script-lib" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394423 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b84835e3-e8bc-4aa4-a8f3-f9be702a358a" volumeName="kubernetes.io/projected/b84835e3-e8bc-4aa4-a8f3-f9be702a358a-kube-api-access-vtwbs" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394433 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c3afc439-ccaa-4751-95a1-ac7557e326f0" volumeName="kubernetes.io/secret/c3afc439-ccaa-4751-95a1-ac7557e326f0-webhook-certs" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394443 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c593a75e-c2af-4419-94da-e0c9ff14c41f" volumeName="kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-etcd-serving-ca" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394452 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f96c70ce-314a-4919-91e9-cc776a620846" volumeName="kubernetes.io/configmap/f96c70ce-314a-4919-91e9-cc776a620846-audit-policies" 
seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394462 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0d4e4f88-7106-4a46-8b63-053345922fb0" volumeName="kubernetes.io/projected/0d4e4f88-7106-4a46-8b63-053345922fb0-kube-api-access-crfnp" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394471 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0d4e4f88-7106-4a46-8b63-053345922fb0" volumeName="kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394481 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af2023e1-9c7a-40af-a6bf-fba31c3565b1" volumeName="kubernetes.io/configmap/af2023e1-9c7a-40af-a6bf-fba31c3565b1-service-ca-bundle" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394492 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c52974d8-fbe6-444b-97ae-468482eebac8" volumeName="kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-config" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394502 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f96c70ce-314a-4919-91e9-cc776a620846" volumeName="kubernetes.io/projected/f96c70ce-314a-4919-91e9-cc776a620846-kube-api-access-lkhn4" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394513 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5decce88-c71e-411c-87b5-a37dd0f77e7b" 
volumeName="kubernetes.io/configmap/5decce88-c71e-411c-87b5-a37dd0f77e7b-trusted-ca" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394523 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" volumeName="kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-trusted-ca-bundle" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394533 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ad22d8ed-2476-441b-aa3b-a7845606b0ac" volumeName="kubernetes.io/configmap/ad22d8ed-2476-441b-aa3b-a7845606b0ac-images" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394542 29252 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f" volumeName="kubernetes.io/configmap/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-config" seLinuxMountContext="" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394552 29252 reconstruct.go:97] "Volume reconstruction finished" Dec 03 20:09:25.399396 master-0 kubenswrapper[29252]: I1203 20:09:25.394560 29252 reconciler.go:26] "Reconciler: start to sync state" Dec 03 20:09:25.412865 master-0 kubenswrapper[29252]: I1203 20:09:25.412188 29252 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 03 20:09:25.415205 master-0 kubenswrapper[29252]: I1203 20:09:25.415155 29252 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 03 20:09:25.415205 master-0 kubenswrapper[29252]: I1203 20:09:25.415210 29252 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 03 20:09:25.416209 master-0 kubenswrapper[29252]: I1203 20:09:25.415228 29252 kubelet.go:2335] "Starting kubelet main sync loop" Dec 03 20:09:25.416209 master-0 kubenswrapper[29252]: E1203 20:09:25.415300 29252 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 03 20:09:25.424428 master-0 kubenswrapper[29252]: I1203 20:09:25.424345 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/3.log" Dec 03 20:09:25.425529 master-0 kubenswrapper[29252]: I1203 20:09:25.424801 29252 generic.go:334] "Generic (PLEG): container finished" podID="3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf" containerID="d77636ae6fa70a30480be55d0b3b081bbffecdd76b888e95fdd9a2954e04756e" exitCode=1 Dec 03 20:09:25.426589 master-0 kubenswrapper[29252]: I1203 20:09:25.426522 29252 generic.go:334] "Generic (PLEG): container finished" podID="6d7367df-4046-4972-abc2-f07eade0ac6b" containerID="9e705cfbdf86095324ded574be9e84d30f2d828c4c08426be6a6b1ed1158bdf8" exitCode=0 Dec 03 20:09:25.432737 master-0 kubenswrapper[29252]: I1203 20:09:25.432691 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-589f5cdc9d-4fzrl_f9f99422-7991-40ef-92a1-de2e603e47b9/cluster-olm-operator/2.log" Dec 03 20:09:25.434413 master-0 kubenswrapper[29252]: I1203 20:09:25.434380 29252 generic.go:334] "Generic (PLEG): container finished" podID="f9f99422-7991-40ef-92a1-de2e603e47b9" containerID="5e83bcf58af1482033711d7ef5e23c1429621a6a16b43c85914ace2af8aca901" exitCode=255 Dec 03 20:09:25.434478 master-0 kubenswrapper[29252]: I1203 20:09:25.434421 29252 generic.go:334] 
"Generic (PLEG): container finished" podID="f9f99422-7991-40ef-92a1-de2e603e47b9" containerID="0651574b36c6a4f52acd96c11c41f938e0a9dc2320440d248364735d4b37969d" exitCode=0 Dec 03 20:09:25.434679 master-0 kubenswrapper[29252]: I1203 20:09:25.434631 29252 generic.go:334] "Generic (PLEG): container finished" podID="f9f99422-7991-40ef-92a1-de2e603e47b9" containerID="431c55fff96bdc81a72543ef7c8b4286f0ecf12b7dc9b0a56daf54373c4eef86" exitCode=0 Dec 03 20:09:25.440134 master-0 kubenswrapper[29252]: I1203 20:09:25.440033 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q47xb_433c3273-c99e-4d68-befc-06f92d2fc8d5/cluster-baremetal-operator/2.log" Dec 03 20:09:25.440447 master-0 kubenswrapper[29252]: I1203 20:09:25.440410 29252 generic.go:334] "Generic (PLEG): container finished" podID="433c3273-c99e-4d68-befc-06f92d2fc8d5" containerID="0714d8c339d81fe37d65f8b61284fb17442521338c0d1beb9a6cde0e4b83dcaa" exitCode=1 Dec 03 20:09:25.446485 master-0 kubenswrapper[29252]: I1203 20:09:25.446418 29252 generic.go:334] "Generic (PLEG): container finished" podID="efa3433149c0833909dd6c97d45272ed" containerID="b7f227f0c18811b7cbe2379656751997168d72c962c1085bafeb8e91aa107a35" exitCode=0 Dec 03 20:09:25.449134 master-0 kubenswrapper[29252]: I1203 20:09:25.449087 29252 generic.go:334] "Generic (PLEG): container finished" podID="bacd155a-fee3-4e5e-89a2-ab86f401d2ff" containerID="82116db57e57089f2a0aaaa865b4d91e3469d2022d11777a1eb493f1bba12223" exitCode=0 Dec 03 20:09:25.452324 master-0 kubenswrapper[29252]: I1203 20:09:25.452277 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/5.log" Dec 03 20:09:25.452415 master-0 kubenswrapper[29252]: I1203 20:09:25.452333 29252 generic.go:334] "Generic (PLEG): container finished" podID="a185ee17-4b4b-4d20-a8ed-56a2a01f1807" 
containerID="0b22734703d42f07c436963e348c3be11ab4f5053e6afed5996abb0dab7d690d" exitCode=255 Dec 03 20:09:25.454366 master-0 kubenswrapper[29252]: I1203 20:09:25.454327 29252 generic.go:334] "Generic (PLEG): container finished" podID="0b6e1832-278b-4e37-b92b-2584e2daa34c" containerID="5340fe194bb64dbc3aba205027b00290cb2a1905847a3d137e4cd0dbb4900723" exitCode=0 Dec 03 20:09:25.456519 master-0 kubenswrapper[29252]: I1203 20:09:25.456485 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/3.log" Dec 03 20:09:25.456604 master-0 kubenswrapper[29252]: I1203 20:09:25.456517 29252 generic.go:334] "Generic (PLEG): container finished" podID="f749c7f2-1fd7-4078-a92d-0ae5523998ac" containerID="f7b2dd4d7eafdc4336ee0182ab9a0527c12ff38408c8d52991e189907554e424" exitCode=255 Dec 03 20:09:25.460353 master-0 kubenswrapper[29252]: I1203 20:09:25.460209 29252 generic.go:334] "Generic (PLEG): container finished" podID="6bb19329-c50c-4214-94c8-7e8771b99233" containerID="58ad9d8d299c84cf4870b5819091b740262ae4d0d8ffa65ef713656d5a0160a8" exitCode=0 Dec 03 20:09:25.460353 master-0 kubenswrapper[29252]: I1203 20:09:25.460238 29252 generic.go:334] "Generic (PLEG): container finished" podID="6bb19329-c50c-4214-94c8-7e8771b99233" containerID="13bde77208cb39b575d114d9d173756c7e7bb201950243c772caea7e6104ce2d" exitCode=0 Dec 03 20:09:25.462727 master-0 kubenswrapper[29252]: I1203 20:09:25.462304 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-75b4d49d4c-pqz7q_0d4e4f88-7106-4a46-8b63-053345922fb0/package-server-manager/0.log" Dec 03 20:09:25.464186 master-0 kubenswrapper[29252]: I1203 20:09:25.463990 29252 generic.go:334] "Generic (PLEG): container finished" podID="0d4e4f88-7106-4a46-8b63-053345922fb0" 
containerID="2f3d798fc128d08f2b78c16a96552eb1af844c024c5ff08c6a9c3b2ad0da6b71" exitCode=1 Dec 03 20:09:25.466919 master-0 kubenswrapper[29252]: I1203 20:09:25.466839 29252 generic.go:334] "Generic (PLEG): container finished" podID="c593a75e-c2af-4419-94da-e0c9ff14c41f" containerID="6652c1726daa2f760a59f8139ccfc6f5f17852cbb0841f5678084529cf67893c" exitCode=0 Dec 03 20:09:25.477194 master-0 kubenswrapper[29252]: I1203 20:09:25.477155 29252 generic.go:334] "Generic (PLEG): container finished" podID="f96c70ce-314a-4919-91e9-cc776a620846" containerID="e2b66d198b3f4fe0e6018d9d1aa589a9d8ed0ff0683d77115b9a3013153ec256" exitCode=0 Dec 03 20:09:25.481327 master-0 kubenswrapper[29252]: E1203 20:09:25.479579 29252 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 20:09:25.482316 master-0 kubenswrapper[29252]: I1203 20:09:25.482088 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-p7xj5_11e2c94f-f9e9-415b-a550-3006a4632ba4/kube-storage-version-migrator-operator/4.log" Dec 03 20:09:25.482316 master-0 kubenswrapper[29252]: I1203 20:09:25.482130 29252 generic.go:334] "Generic (PLEG): container finished" podID="11e2c94f-f9e9-415b-a550-3006a4632ba4" containerID="89ed390af07eecb0f2a6fd24fe986b57e8e8f83dbf2ff2202963967a2fcc7b5e" exitCode=255 Dec 03 20:09:25.484439 master-0 kubenswrapper[29252]: I1203 20:09:25.484246 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-8xmrv_0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/openshift-config-operator/4.log" Dec 03 20:09:25.487542 master-0 kubenswrapper[29252]: I1203 20:09:25.487260 29252 generic.go:334] "Generic (PLEG): container finished" podID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerID="30e0205b9f3aae7684b5e5dd37ee0470857f4a7020b8a45ab64071c7372511a7" exitCode=255 Dec 03 20:09:25.487542 
master-0 kubenswrapper[29252]: I1203 20:09:25.487303 29252 generic.go:334] "Generic (PLEG): container finished" podID="0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9" containerID="ebce70450136604f7c52ead6ab27edb4126b2802849c71ec6e71d90ddadab566" exitCode=0 Dec 03 20:09:25.495561 master-0 kubenswrapper[29252]: I1203 20:09:25.495453 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-7978bf889c-mqpzf_78a864f2-934f-4197-9753-24c9bc7f1fca/etcd-operator/3.log" Dec 03 20:09:25.495561 master-0 kubenswrapper[29252]: I1203 20:09:25.495491 29252 generic.go:334] "Generic (PLEG): container finished" podID="78a864f2-934f-4197-9753-24c9bc7f1fca" containerID="9f244f1d436466a4ae57b971d0160d2b30815a69ea07caf71d6b0728312b0abd" exitCode=255 Dec 03 20:09:25.503081 master-0 kubenswrapper[29252]: I1203 20:09:25.502934 29252 generic.go:334] "Generic (PLEG): container finished" podID="cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2" containerID="aa9d0cc86210e7d9335ee33dc0e24caf30866ce853c547c220c347b3bc7052c9" exitCode=0 Dec 03 20:09:25.503081 master-0 kubenswrapper[29252]: I1203 20:09:25.502964 29252 generic.go:334] "Generic (PLEG): container finished" podID="cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2" containerID="199eaf1616b5ae06926193f7d4e723c00bcb81929b670fb413bd36d7bf6e1d63" exitCode=0 Dec 03 20:09:25.505769 master-0 kubenswrapper[29252]: I1203 20:09:25.505741 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/3.log" Dec 03 20:09:25.506081 master-0 kubenswrapper[29252]: I1203 20:09:25.506053 29252 generic.go:334] "Generic (PLEG): container finished" podID="b495b0c38f2c54e7cc46282c5f92aab5" containerID="1e627b854436f132d47750eca5e55963c07ce2a82bb65e7317d2c359a44e0385" exitCode=1 Dec 03 20:09:25.506081 master-0 kubenswrapper[29252]: I1203 20:09:25.506076 29252 generic.go:334] "Generic (PLEG): container finished" 
podID="b495b0c38f2c54e7cc46282c5f92aab5" containerID="fc36d2a6c391f335aef0b36d050ebf1f8ee2adf514fce8229acd7a314425647c" exitCode=0 Dec 03 20:09:25.508641 master-0 kubenswrapper[29252]: I1203 20:09:25.508613 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-w7d8t_6eb4700c-6af0-468b-afc8-1e09b902d6bf/network-operator/4.log" Dec 03 20:09:25.508727 master-0 kubenswrapper[29252]: I1203 20:09:25.508651 29252 generic.go:334] "Generic (PLEG): container finished" podID="6eb4700c-6af0-468b-afc8-1e09b902d6bf" containerID="728aa51e420a0e8c358ef69d6ddcb175d50c7be37aab4f4fdfde93a0791a7b8e" exitCode=255 Dec 03 20:09:25.510383 master-0 kubenswrapper[29252]: I1203 20:09:25.510360 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-5f78c89466-vkcnf_73b7027e-44f5-4c7b-9226-585a90530535/manager/0.log" Dec 03 20:09:25.510464 master-0 kubenswrapper[29252]: I1203 20:09:25.510394 29252 generic.go:334] "Generic (PLEG): container finished" podID="73b7027e-44f5-4c7b-9226-585a90530535" containerID="3595f145ca5f9a4066302e9ae5d79e04995d58d28db2a03322a4e2a341e9fec2" exitCode=1 Dec 03 20:09:25.514260 master-0 kubenswrapper[29252]: I1203 20:09:25.514229 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-bbd9b9dff-vqzdb_7ed25861-1328-45e7-922e-37588a0b019c/cluster-node-tuning-operator/0.log" Dec 03 20:09:25.514428 master-0 kubenswrapper[29252]: I1203 20:09:25.514384 29252 generic.go:334] "Generic (PLEG): container finished" podID="7ed25861-1328-45e7-922e-37588a0b019c" containerID="b15d5b3401a95a50f5c18b6410300731cd922d460a927b29c822856e4c00523b" exitCode=1 Dec 03 20:09:25.515457 master-0 kubenswrapper[29252]: E1203 20:09:25.515445 29252 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 03 20:09:25.515667 
master-0 kubenswrapper[29252]: I1203 20:09:25.515656 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p9sdj_a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6/kube-multus/0.log" Dec 03 20:09:25.515744 master-0 kubenswrapper[29252]: I1203 20:09:25.515732 29252 generic.go:334] "Generic (PLEG): container finished" podID="a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6" containerID="8a74abebd0e92eb267bf92fa216f251466a061d49782c0f5612aabcb75ab61c6" exitCode=1 Dec 03 20:09:25.521518 master-0 kubenswrapper[29252]: I1203 20:09:25.521410 29252 generic.go:334] "Generic (PLEG): container finished" podID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerID="cf0ee7669690c522329aaa6f304ff26947b64db398890f4e63ca209b2410a161" exitCode=0 Dec 03 20:09:25.524968 master-0 kubenswrapper[29252]: I1203 20:09:25.524940 29252 generic.go:334] "Generic (PLEG): container finished" podID="a710102c-72fb-4d8d-ad99-71940368a09e" containerID="c62ca34d47648391f608e6f2fa80f298167bf660cde830ab9846e95ff4484b7f" exitCode=0 Dec 03 20:09:25.525073 master-0 kubenswrapper[29252]: I1203 20:09:25.525060 29252 generic.go:334] "Generic (PLEG): container finished" podID="a710102c-72fb-4d8d-ad99-71940368a09e" containerID="90e536e37d10c97618a40c363b7fe1c09180dc7a8bef1b5767ffc36ddc8dad7f" exitCode=0 Dec 03 20:09:25.528465 master-0 kubenswrapper[29252]: I1203 20:09:25.528437 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_c46583dca69d50bb12bc004d7ee3300f/kube-scheduler-cert-syncer/0.log" Dec 03 20:09:25.529059 master-0 kubenswrapper[29252]: I1203 20:09:25.529017 29252 generic.go:334] "Generic (PLEG): container finished" podID="c46583dca69d50bb12bc004d7ee3300f" containerID="89262883631fb1dcd59cc7a0a7e0379a0e77dd0b25dc2b21a16372a6fe8d007e" exitCode=1 Dec 03 20:09:25.529059 master-0 kubenswrapper[29252]: I1203 20:09:25.529054 29252 generic.go:334] "Generic (PLEG): container finished" podID="c46583dca69d50bb12bc004d7ee3300f" 
containerID="384902c9d5118b992b516df4665219d1bebf7324327cde78b939566df8720f4b" exitCode=0 Dec 03 20:09:25.530502 master-0 kubenswrapper[29252]: I1203 20:09:25.530467 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-79ccj_e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3/kube-controller-manager-operator/4.log" Dec 03 20:09:25.530626 master-0 kubenswrapper[29252]: I1203 20:09:25.530518 29252 generic.go:334] "Generic (PLEG): container finished" podID="e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3" containerID="64faeeb7a4647a9e5dd702400fe60f14013f02b00360bb310c4d37859f33d70c" exitCode=255 Dec 03 20:09:25.537209 master-0 kubenswrapper[29252]: I1203 20:09:25.537166 29252 generic.go:334] "Generic (PLEG): container finished" podID="4dd8b778e190b1975a0a8fad534da6dd" containerID="7ffe9984ab39638ad7730b79c49181e26ef0a2e2748c84910693d2353db0a811" exitCode=0 Dec 03 20:09:25.537209 master-0 kubenswrapper[29252]: I1203 20:09:25.537201 29252 generic.go:334] "Generic (PLEG): container finished" podID="4dd8b778e190b1975a0a8fad534da6dd" containerID="3c8f577be66a40b37f0664a12c17056548ea3c9d36cd14f671ca30ad04cfd997" exitCode=0 Dec 03 20:09:25.537309 master-0 kubenswrapper[29252]: I1203 20:09:25.537213 29252 generic.go:334] "Generic (PLEG): container finished" podID="4dd8b778e190b1975a0a8fad534da6dd" containerID="121d9626cd0411e9b91e157dd5da2678c7631550b10f391133d8192123b5c231" exitCode=0 Dec 03 20:09:25.540006 master-0 kubenswrapper[29252]: I1203 20:09:25.539963 29252 generic.go:334] "Generic (PLEG): container finished" podID="d210062f-c07e-419f-a551-c37571565686" containerID="2d7be3731fbc745283a2d759f396c31ac1367c0ba714305c646e32b354747fdc" exitCode=0 Dec 03 20:09:25.543376 master-0 kubenswrapper[29252]: I1203 20:09:25.543334 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_9afe01c7-825c-43d1-8425-0317cdde11d6/installer/0.log" Dec 03 
20:09:25.543462 master-0 kubenswrapper[29252]: I1203 20:09:25.543399 29252 generic.go:334] "Generic (PLEG): container finished" podID="9afe01c7-825c-43d1-8425-0317cdde11d6" containerID="7defd583f52b28f4c8a42f8533bc6a235b9b9753c15d53b3d581070bd6b239c4" exitCode=1 Dec 03 20:09:25.546523 master-0 kubenswrapper[29252]: I1203 20:09:25.546496 29252 generic.go:334] "Generic (PLEG): container finished" podID="b673cb04-f6f0-4113-bdcd-d6685b942c9f" containerID="efb0326864f224addc60569e753ed4f7ba080c2fc63c85d174a9de0f4aa3dad6" exitCode=0 Dec 03 20:09:25.548084 master-0 kubenswrapper[29252]: I1203 20:09:25.548061 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-6b8bb995f7-bj4vz_63e3d36d-1676-4f90-ac9a-d85b861a4655/service-ca-controller/2.log" Dec 03 20:09:25.548146 master-0 kubenswrapper[29252]: I1203 20:09:25.548094 29252 generic.go:334] "Generic (PLEG): container finished" podID="63e3d36d-1676-4f90-ac9a-d85b861a4655" containerID="9bdf161e72b6c048ac479aec18a819118a43011cc40adece64e6528d1dc8ecda" exitCode=255 Dec 03 20:09:25.551069 master-0 kubenswrapper[29252]: I1203 20:09:25.551030 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-r2kpn_c4d45235-fb1a-4626-a41e-b1e34f7bf76e/approver/0.log" Dec 03 20:09:25.551404 master-0 kubenswrapper[29252]: I1203 20:09:25.551351 29252 generic.go:334] "Generic (PLEG): container finished" podID="c4d45235-fb1a-4626-a41e-b1e34f7bf76e" containerID="65f13f5f310f6f953b71a1a783c24c03bd5eb6d2106c3ba74515208177e8e054" exitCode=1 Dec 03 20:09:25.553367 master-0 kubenswrapper[29252]: I1203 20:09:25.553168 29252 generic.go:334] "Generic (PLEG): container finished" podID="b638f207-31df-4298-8801-4da6031deefc" containerID="d2c1886a2860f8a9cfc62feb851502428fec91b03a3c1244620b2a342cd94941" exitCode=0 Dec 03 20:09:25.553367 master-0 kubenswrapper[29252]: I1203 20:09:25.553202 29252 generic.go:334] "Generic (PLEG): container finished" 
podID="b638f207-31df-4298-8801-4da6031deefc" containerID="faf3a48e7c674daa85ae24cd3640d8c54a246a72784d7207fb68637d0b2401d5" exitCode=0 Dec 03 20:09:25.554759 master-0 kubenswrapper[29252]: I1203 20:09:25.554735 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-v6rp5_01d51d9a-9beb-4357-9dc2-aeac210cd0c4/service-ca-operator/4.log" Dec 03 20:09:25.554838 master-0 kubenswrapper[29252]: I1203 20:09:25.554766 29252 generic.go:334] "Generic (PLEG): container finished" podID="01d51d9a-9beb-4357-9dc2-aeac210cd0c4" containerID="c2730eaef31938f9b283223c81622c1d4bbc549630ded57fc1762a2568d60b23" exitCode=255 Dec 03 20:09:25.557294 master-0 kubenswrapper[29252]: I1203 20:09:25.557254 29252 generic.go:334] "Generic (PLEG): container finished" podID="e73e6013-87fc-40e2-a573-39930828faa7" containerID="a73aec75e1cef31c969c506854b5ac02887023e7a9ddf7c907ea711f21b91d25" exitCode=0 Dec 03 20:09:25.560289 master-0 kubenswrapper[29252]: I1203 20:09:25.560219 29252 generic.go:334] "Generic (PLEG): container finished" podID="87f1759a-7df4-442e-a22d-6de8d54be333" containerID="b2c2ebffcad93a655874c4b2c0e0dae1edf07cc0c8e231705d220b5fe6aadf15" exitCode=0 Dec 03 20:09:25.560289 master-0 kubenswrapper[29252]: I1203 20:09:25.560267 29252 generic.go:334] "Generic (PLEG): container finished" podID="87f1759a-7df4-442e-a22d-6de8d54be333" containerID="a396f10beccb65f07ed52d9f7eed56b73ee45537150d1fb69cde98622f0ce32a" exitCode=0 Dec 03 20:09:25.560289 master-0 kubenswrapper[29252]: I1203 20:09:25.560286 29252 generic.go:334] "Generic (PLEG): container finished" podID="87f1759a-7df4-442e-a22d-6de8d54be333" containerID="3e816effb094becdc3c407acbb3f9f27817216cdbfc7352da3c72fba2c274e3e" exitCode=0 Dec 03 20:09:25.560447 master-0 kubenswrapper[29252]: I1203 20:09:25.560301 29252 generic.go:334] "Generic (PLEG): container finished" podID="87f1759a-7df4-442e-a22d-6de8d54be333" 
containerID="5e06cf682588907f65a412d4ac6d4481e139ecf6ab4739442acce6158ba8872d" exitCode=0 Dec 03 20:09:25.560447 master-0 kubenswrapper[29252]: I1203 20:09:25.560316 29252 generic.go:334] "Generic (PLEG): container finished" podID="87f1759a-7df4-442e-a22d-6de8d54be333" containerID="afd903622e2f7d6d9391f2df58084fdf90b41e4e17808cb5e2d5c792f644b6df" exitCode=0 Dec 03 20:09:25.560447 master-0 kubenswrapper[29252]: I1203 20:09:25.560329 29252 generic.go:334] "Generic (PLEG): container finished" podID="87f1759a-7df4-442e-a22d-6de8d54be333" containerID="9f0406d26b61880d05d604bbabebaeef16d5bda27cf4f4f9e097201539e44456" exitCode=0 Dec 03 20:09:25.565487 master-0 kubenswrapper[29252]: I1203 20:09:25.565324 29252 generic.go:334] "Generic (PLEG): container finished" podID="2f618ea7-3ad7-4dce-b450-a8202285f312" containerID="dddd03afbbaf28bd7aa58c27ce415ad910bb5c941f19a9c53d3832794bc71ce3" exitCode=0 Dec 03 20:09:25.566886 master-0 kubenswrapper[29252]: I1203 20:09:25.566857 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/3.log" Dec 03 20:09:25.566952 master-0 kubenswrapper[29252]: I1203 20:09:25.566893 29252 generic.go:334] "Generic (PLEG): container finished" podID="367c2c7c-1fc8-4608-aa94-b64c6c70cc61" containerID="8112a7cb98ed4f9746283158ddbbb35ec5fbfefafdb864fd1afaa4c7f81f5842" exitCode=1 Dec 03 20:09:25.569179 master-0 kubenswrapper[29252]: I1203 20:09:25.569139 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-j2wgx_5b3ee9a2-0f17-4a04-9191-b60684ef6c29/kube-scheduler-operator-container/3.log" Dec 03 20:09:25.569251 master-0 kubenswrapper[29252]: I1203 20:09:25.569195 29252 generic.go:334] "Generic (PLEG): container finished" podID="5b3ee9a2-0f17-4a04-9191-b60684ef6c29" 
containerID="0261bc02d30c23a023b1b2c969bc5effe6635690c48ec42070b21b48058d37f0" exitCode=255 Dec 03 20:09:25.573256 master-0 kubenswrapper[29252]: I1203 20:09:25.573222 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-754cfd84-xfv5j_1f82c7a1-ec21-497d-86f2-562cafa7ace7/manager/0.log" Dec 03 20:09:25.573527 master-0 kubenswrapper[29252]: I1203 20:09:25.573490 29252 generic.go:334] "Generic (PLEG): container finished" podID="1f82c7a1-ec21-497d-86f2-562cafa7ace7" containerID="026026ef6ee70bf24fbc2d66c86cdbf2ce61498e9a51c23017b8994c7f1700dd" exitCode=1 Dec 03 20:09:25.576066 master-0 kubenswrapper[29252]: I1203 20:09:25.576034 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-8jzqh_daa8efc0-4514-4a14-80f5-ab9eca53a127/openshift-controller-manager-operator/4.log" Dec 03 20:09:25.576124 master-0 kubenswrapper[29252]: I1203 20:09:25.576069 29252 generic.go:334] "Generic (PLEG): container finished" podID="daa8efc0-4514-4a14-80f5-ab9eca53a127" containerID="2fddc42d6267903d2d9ec20253e1576f35e19a3bb53e9ddf0c42ac6c45e614ec" exitCode=255 Dec 03 20:09:25.578696 master-0 kubenswrapper[29252]: I1203 20:09:25.578658 29252 generic.go:334] "Generic (PLEG): container finished" podID="b8709c6c-8729-4702-a3fb-35a072855096" containerID="f74560024271b473d288e14ac60c9ecd05f2a6752be21eac89b4a74e35f9a5d8" exitCode=0 Dec 03 20:09:25.579826 master-0 kubenswrapper[29252]: I1203 20:09:25.579768 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_186cc14f-5f58-43ca-8ffa-db07606ff0f7/installer/0.log" Dec 03 20:09:25.579892 master-0 kubenswrapper[29252]: E1203 20:09:25.579822 29252 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 20:09:25.579892 master-0 kubenswrapper[29252]: I1203 20:09:25.579829 29252 
generic.go:334] "Generic (PLEG): container finished" podID="186cc14f-5f58-43ca-8ffa-db07606ff0f7" containerID="5217957523f4b5166716d8ff3b268cfc1e054e38ab89fcd916d9adc0a629dce1" exitCode=1 Dec 03 20:09:25.583437 master-0 kubenswrapper[29252]: I1203 20:09:25.583411 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-7b795784b8-4gppw_b84835e3-e8bc-4aa4-a8f3-f9be702a358a/csi-snapshot-controller-operator/2.log" Dec 03 20:09:25.583493 master-0 kubenswrapper[29252]: I1203 20:09:25.583448 29252 generic.go:334] "Generic (PLEG): container finished" podID="b84835e3-e8bc-4aa4-a8f3-f9be702a358a" containerID="cec06c56e683cc0577fad0a71ec4c6d696a85de6a5454d15d4616410438d6c01" exitCode=255 Dec 03 20:09:25.584582 master-0 kubenswrapper[29252]: I1203 20:09:25.584563 29252 generic.go:334] "Generic (PLEG): container finished" podID="ce4afc7a-a338-4a2c-bada-22d4bac75d49" containerID="6734488c6ce6905e5e770b668e83066dd3b8267a0d3cf0d97567edcd50a10461" exitCode=0 Dec 03 20:09:25.586190 master-0 kubenswrapper[29252]: I1203 20:09:25.586168 29252 generic.go:334] "Generic (PLEG): container finished" podID="5decce88-c71e-411c-87b5-a37dd0f77e7b" containerID="ce3971a00b14ee7d8820c7e2ce38f070172641049e39dce3eb3a076d83a464ea" exitCode=0 Dec 03 20:09:25.588118 master-0 kubenswrapper[29252]: I1203 20:09:25.588086 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-lsltt_d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f/openshift-apiserver-operator/4.log" Dec 03 20:09:25.588118 master-0 kubenswrapper[29252]: I1203 20:09:25.588114 29252 generic.go:334] "Generic (PLEG): container finished" podID="d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f" containerID="db25cf44f0675c418850d8d41463efcb1765ff94722958664210b9165ac00ff3" exitCode=255 Dec 03 20:09:25.590070 master-0 kubenswrapper[29252]: I1203 20:09:25.590053 29252 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/4.log" Dec 03 20:09:25.590225 master-0 kubenswrapper[29252]: I1203 20:09:25.590206 29252 generic.go:334] "Generic (PLEG): container finished" podID="943feb0d-7d31-446a-9100-dfc4ef013d12" containerID="abf1acea0f13046f42e18d29f9f01a5591776e77d3e8cc4b525da74b968fc06b" exitCode=255 Dec 03 20:09:25.592190 master-0 kubenswrapper[29252]: I1203 20:09:25.592153 29252 generic.go:334] "Generic (PLEG): container finished" podID="af2023e1-9c7a-40af-a6bf-fba31c3565b1" containerID="9ee7a9ba017971cc72c48a14fbe564128a44ff608d460db457bf85730f38fd52" exitCode=0 Dec 03 20:09:25.594257 master-0 kubenswrapper[29252]: I1203 20:09:25.594236 29252 generic.go:334] "Generic (PLEG): container finished" podID="8dbbb6f8-711c-49a0-bc36-fa5d50124bd8" containerID="33fc3458349b78bc19c8b30395e299c49cdfbf37f7e541929fe27fba4fc59440" exitCode=0 Dec 03 20:09:25.679954 master-0 kubenswrapper[29252]: E1203 20:09:25.679923 29252 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 20:09:25.715671 master-0 kubenswrapper[29252]: E1203 20:09:25.715624 29252 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 03 20:09:25.742922 master-0 kubenswrapper[29252]: I1203 20:09:25.742877 29252 manager.go:324] Recovery completed Dec 03 20:09:25.780628 master-0 kubenswrapper[29252]: E1203 20:09:25.780504 29252 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 20:09:25.804590 master-0 kubenswrapper[29252]: I1203 20:09:25.804553 29252 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 20:09:25.809713 master-0 kubenswrapper[29252]: I1203 20:09:25.809690 29252 kubelet_node_status.go:724] "Recording event message for node" 
node="master-0" event="NodeHasSufficientMemory" Dec 03 20:09:25.809875 master-0 kubenswrapper[29252]: I1203 20:09:25.809862 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 20:09:25.809943 master-0 kubenswrapper[29252]: I1203 20:09:25.809934 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 20:09:25.814828 master-0 kubenswrapper[29252]: I1203 20:09:25.814738 29252 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 03 20:09:25.814828 master-0 kubenswrapper[29252]: I1203 20:09:25.814817 29252 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 03 20:09:25.814924 master-0 kubenswrapper[29252]: I1203 20:09:25.814850 29252 state_mem.go:36] "Initialized new in-memory state store" Dec 03 20:09:25.815354 master-0 kubenswrapper[29252]: I1203 20:09:25.815324 29252 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 03 20:09:25.815412 master-0 kubenswrapper[29252]: I1203 20:09:25.815348 29252 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 03 20:09:25.815412 master-0 kubenswrapper[29252]: I1203 20:09:25.815378 29252 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Dec 03 20:09:25.815412 master-0 kubenswrapper[29252]: I1203 20:09:25.815386 29252 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Dec 03 20:09:25.815412 master-0 kubenswrapper[29252]: I1203 20:09:25.815393 29252 policy_none.go:49] "None policy: Start" Dec 03 20:09:25.818991 master-0 kubenswrapper[29252]: I1203 20:09:25.818914 29252 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 03 20:09:25.818991 master-0 kubenswrapper[29252]: I1203 20:09:25.818980 29252 state_mem.go:35] "Initializing new in-memory state store" Dec 03 20:09:25.819277 master-0 kubenswrapper[29252]: I1203 20:09:25.819250 29252 state_mem.go:75] "Updated machine memory state" Dec 03 20:09:25.819277 
master-0 kubenswrapper[29252]: I1203 20:09:25.819270 29252 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Dec 03 20:09:25.834209 master-0 kubenswrapper[29252]: I1203 20:09:25.834171 29252 manager.go:334] "Starting Device Plugin manager" Dec 03 20:09:25.834289 master-0 kubenswrapper[29252]: I1203 20:09:25.834226 29252 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 03 20:09:25.834289 master-0 kubenswrapper[29252]: I1203 20:09:25.834240 29252 server.go:79] "Starting device plugin registration server" Dec 03 20:09:25.834645 master-0 kubenswrapper[29252]: I1203 20:09:25.834611 29252 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 03 20:09:25.834681 master-0 kubenswrapper[29252]: I1203 20:09:25.834642 29252 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 03 20:09:25.835274 master-0 kubenswrapper[29252]: I1203 20:09:25.835210 29252 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 03 20:09:25.835441 master-0 kubenswrapper[29252]: I1203 20:09:25.835374 29252 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 03 20:09:25.835441 master-0 kubenswrapper[29252]: I1203 20:09:25.835398 29252 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 03 20:09:25.843092 master-0 kubenswrapper[29252]: E1203 20:09:25.843054 29252 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 03 20:09:25.934879 master-0 kubenswrapper[29252]: I1203 20:09:25.934772 29252 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 20:09:25.937460 master-0 kubenswrapper[29252]: I1203 20:09:25.937409 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientMemory" Dec 03 20:09:25.937719 master-0 kubenswrapper[29252]: I1203 20:09:25.937687 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 20:09:25.937974 master-0 kubenswrapper[29252]: I1203 20:09:25.937944 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 20:09:25.938213 master-0 kubenswrapper[29252]: I1203 20:09:25.938184 29252 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 20:09:25.943979 master-0 kubenswrapper[29252]: E1203 20:09:25.943924 29252 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0" Dec 03 20:09:26.115999 master-0 kubenswrapper[29252]: I1203 20:09:26.115853 29252 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0","openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 03 20:09:26.116300 master-0 kubenswrapper[29252]: I1203 20:09:26.116010 29252 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 20:09:26.118341 master-0 kubenswrapper[29252]: I1203 20:09:26.118290 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 20:09:26.118341 master-0 kubenswrapper[29252]: I1203 20:09:26.118315 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 20:09:26.118341 master-0 kubenswrapper[29252]: I1203 20:09:26.118323 29252 kubelet_node_status.go:724] 
"Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 20:09:26.119850 master-0 kubenswrapper[29252]: I1203 20:09:26.118410 29252 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 20:09:26.120737 master-0 kubenswrapper[29252]: I1203 20:09:26.120698 29252 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 20:09:26.122455 master-0 kubenswrapper[29252]: I1203 20:09:26.122384 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 20:09:26.122595 master-0 kubenswrapper[29252]: I1203 20:09:26.122480 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 20:09:26.122595 master-0 kubenswrapper[29252]: I1203 20:09:26.122510 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 20:09:26.122825 master-0 kubenswrapper[29252]: I1203 20:09:26.122645 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 20:09:26.122825 master-0 kubenswrapper[29252]: I1203 20:09:26.122701 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 20:09:26.122825 master-0 kubenswrapper[29252]: I1203 20:09:26.122721 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 20:09:26.123102 master-0 kubenswrapper[29252]: I1203 20:09:26.123028 29252 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 20:09:26.124088 master-0 kubenswrapper[29252]: I1203 20:09:26.124027 29252 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 20:09:26.128228 master-0 kubenswrapper[29252]: I1203 
20:09:26.128186 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 20:09:26.128228 master-0 kubenswrapper[29252]: I1203 20:09:26.128219 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 20:09:26.128228 master-0 kubenswrapper[29252]: I1203 20:09:26.128233 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 20:09:26.128477 master-0 kubenswrapper[29252]: I1203 20:09:26.128212 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 20:09:26.128477 master-0 kubenswrapper[29252]: I1203 20:09:26.128423 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 20:09:26.128477 master-0 kubenswrapper[29252]: I1203 20:09:26.128461 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 20:09:26.128701 master-0 kubenswrapper[29252]: I1203 20:09:26.128658 29252 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 20:09:26.129055 master-0 kubenswrapper[29252]: I1203 20:09:26.129010 29252 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 20:09:26.132599 master-0 kubenswrapper[29252]: I1203 20:09:26.132554 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 20:09:26.132599 master-0 kubenswrapper[29252]: I1203 20:09:26.132583 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 20:09:26.132599 master-0 kubenswrapper[29252]: I1203 20:09:26.132581 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientMemory" Dec 03 20:09:26.132923 master-0 kubenswrapper[29252]: I1203 20:09:26.132627 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 20:09:26.132923 master-0 kubenswrapper[29252]: I1203 20:09:26.132650 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 20:09:26.132923 master-0 kubenswrapper[29252]: I1203 20:09:26.132591 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 20:09:26.133088 master-0 kubenswrapper[29252]: I1203 20:09:26.133026 29252 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 20:09:26.133390 master-0 kubenswrapper[29252]: I1203 20:09:26.133338 29252 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 20:09:26.134948 master-0 kubenswrapper[29252]: I1203 20:09:26.134911 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 20:09:26.134948 master-0 kubenswrapper[29252]: I1203 20:09:26.134941 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 20:09:26.134948 master-0 kubenswrapper[29252]: I1203 20:09:26.134950 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 20:09:26.135161 master-0 kubenswrapper[29252]: I1203 20:09:26.135048 29252 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 20:09:26.135290 master-0 kubenswrapper[29252]: I1203 20:09:26.135248 29252 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 20:09:26.135530 master-0 kubenswrapper[29252]: I1203 20:09:26.135499 29252 kubelet_node_status.go:724] 
"Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 20:09:26.135676 master-0 kubenswrapper[29252]: I1203 20:09:26.135656 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 20:09:26.135895 master-0 kubenswrapper[29252]: I1203 20:09:26.135774 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 20:09:26.136638 master-0 kubenswrapper[29252]: I1203 20:09:26.136601 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 20:09:26.136638 master-0 kubenswrapper[29252]: I1203 20:09:26.136625 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 20:09:26.136638 master-0 kubenswrapper[29252]: I1203 20:09:26.136633 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 20:09:26.136887 master-0 kubenswrapper[29252]: I1203 20:09:26.136732 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c728a08bf34863d26a3eb03645de957a75f62d9852b3c9d02cdccd664afb9f13" Dec 03 20:09:26.136887 master-0 kubenswrapper[29252]: I1203 20:09:26.136750 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb66039883a03fd1626aa3dffc21a20bb7b9e0bf48c135b576d0ba2ac23105d3" Dec 03 20:09:26.136887 master-0 kubenswrapper[29252]: I1203 20:09:26.136767 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"efa3433149c0833909dd6c97d45272ed","Type":"ContainerStarted","Data":"eaee7d593be548d765bd8f2d4c68f44dae98c436387a0e4202467ebdd81d0080"} Dec 03 20:09:26.136887 master-0 kubenswrapper[29252]: I1203 20:09:26.136827 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"efa3433149c0833909dd6c97d45272ed","Type":"ContainerStarted","Data":"906c3bca6e6dba68fc1f6de40f7c8c547d5581dd0b1d1d54e1eb0216877477d1"} Dec 03 20:09:26.136887 master-0 kubenswrapper[29252]: I1203 20:09:26.136838 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"efa3433149c0833909dd6c97d45272ed","Type":"ContainerStarted","Data":"009859dfb35cd527b6f73d6e859baa9e361df0c4b1b88a554c64f8f7fef46bf9"} Dec 03 20:09:26.136887 master-0 kubenswrapper[29252]: I1203 20:09:26.136846 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"efa3433149c0833909dd6c97d45272ed","Type":"ContainerStarted","Data":"22463a8cc479cad0b18b7fc08c942143546ab5eb5ece91f4b438d1d815531859"} Dec 03 20:09:26.136887 master-0 kubenswrapper[29252]: I1203 20:09:26.136857 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"efa3433149c0833909dd6c97d45272ed","Type":"ContainerDied","Data":"b7f227f0c18811b7cbe2379656751997168d72c962c1085bafeb8e91aa107a35"} Dec 03 20:09:26.136887 master-0 kubenswrapper[29252]: I1203 20:09:26.136869 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"efa3433149c0833909dd6c97d45272ed","Type":"ContainerStarted","Data":"1caef0a819570cf6a2811866d8d10fd6e09b188be5e4d722967523e3ffefcc98"} Dec 03 20:09:26.136887 master-0 kubenswrapper[29252]: I1203 20:09:26.136877 29252 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 20:09:26.136887 master-0 kubenswrapper[29252]: I1203 20:09:26.136885 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40375e8c9304a9008fd3f0ffbd7abeeba9e1599c1f09821321074397cef514ba" Dec 03 20:09:26.136887 master-0 
kubenswrapper[29252]: I1203 20:09:26.136902 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2dc381563df2a0e13918cfa2451a9b174e4604bd05cb59f08912f9e42b984c0" Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137024 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137039 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137049 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137039 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"dc347c4e75ec09c3a7fea6a3ba3ee63c","Type":"ContainerStarted","Data":"eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137139 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"dc347c4e75ec09c3a7fea6a3ba3ee63c","Type":"ContainerStarted","Data":"1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137149 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"dc347c4e75ec09c3a7fea6a3ba3ee63c","Type":"ContainerStarted","Data":"3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137159 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"dc347c4e75ec09c3a7fea6a3ba3ee63c","Type":"ContainerStarted","Data":"e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137167 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"dc347c4e75ec09c3a7fea6a3ba3ee63c","Type":"ContainerStarted","Data":"d48938de69765a143714e3f72409a39d0152006d3aa2fff72b2bf45a3ae1e272"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137183 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerStarted","Data":"613dced068ceb2df4bbce683ccba9c87ef2fc3f6a3e401852118424ac1bf3a4c"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137192 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerDied","Data":"1e627b854436f132d47750eca5e55963c07ce2a82bb65e7317d2c359a44e0385"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137201 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerDied","Data":"fc36d2a6c391f335aef0b36d050ebf1f8ee2adf514fce8229acd7a314425647c"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137210 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerStarted","Data":"46b628f030def8d568abe6c88697be71ce064596569bc0a66bddd83c9802cf26"} Dec 03 20:09:26.137720 master-0 
kubenswrapper[29252]: I1203 20:09:26.137242 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e223c914bb9fdab3679b22e12a3423e70834ea2d5e7b1b525318a3b2a1eb7382" Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137249 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"b0f7c518a656139710b17a7667c8b898","Type":"ContainerStarted","Data":"10dd5e50757ca6d8fb428d9d41440e88b1cc3fce51685a0860bb2b0898ea0950"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137258 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"b0f7c518a656139710b17a7667c8b898","Type":"ContainerStarted","Data":"fcef5c26197d88811bd202bc70d1bd384b05a27d2d38eb35b486b482203bd347"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137281 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"c46583dca69d50bb12bc004d7ee3300f","Type":"ContainerStarted","Data":"88a426b4c066f4efd6c67dba2d50d1674139b8757075139f8541302d74a32ce6"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137289 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"c46583dca69d50bb12bc004d7ee3300f","Type":"ContainerStarted","Data":"a72510073f92e9ff068e8652b1a65285f64ee333e40d80be23e60bf13a3ce72d"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137298 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"c46583dca69d50bb12bc004d7ee3300f","Type":"ContainerDied","Data":"89262883631fb1dcd59cc7a0a7e0379a0e77dd0b25dc2b21a16372a6fe8d007e"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 
20:09:26.137308 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"c46583dca69d50bb12bc004d7ee3300f","Type":"ContainerStarted","Data":"73fd77c7f3160f50b85cebcaf7773a33c44b0958115b084cb590bef38d48ba5c"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137318 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"c46583dca69d50bb12bc004d7ee3300f","Type":"ContainerDied","Data":"384902c9d5118b992b516df4665219d1bebf7324327cde78b939566df8720f4b"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137326 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"c46583dca69d50bb12bc004d7ee3300f","Type":"ContainerStarted","Data":"d9703a47499dc38f6845f2d55184a1985a6a96f9f0e663c0707d6562d50b0c0c"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137341 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"230fc5938de0c5a6e2516202d99d270da453c1967a8773858a25118455179d5a"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137352 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"ed490ef4ea8d419c12fbad3b98e447ddc9c1f2075c437754f8b557557383a2df"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137361 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"7134dda62a594c58ac76c0bee69ff785ac0ff610ef7d3c4df129e50bb11aec80"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137370 29252 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"d4df7bcfbcc85bcadd5d89c40467a0c62a261fe9df1907801d9e1c35e6fc353d"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137379 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"0e716c369f30bcb4fd885d5df2bfefde9afaf605da0247b8ed1b0e099f4fccca"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137387 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerDied","Data":"7ffe9984ab39638ad7730b79c49181e26ef0a2e2748c84910693d2353db0a811"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137397 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerDied","Data":"3c8f577be66a40b37f0664a12c17056548ea3c9d36cd14f671ca30ad04cfd997"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137406 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerDied","Data":"121d9626cd0411e9b91e157dd5da2678c7631550b10f391133d8192123b5c231"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137415 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"1992b1130615c3114c9b58cd6decbf77558f0295aafbe17982440031c3ee9788"} Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137439 29252 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d01898ca09cc6e5ead466458571ed251bc45975a2add401e6cca184da08be158" Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137484 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="458285225e2ebef4d74a454c189a4334305b7449f5fc5767f5024ba6cedb0614" Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137546 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42e1b375dcaebdf8d6351192223452d8b91294cb866b8e2a93c4bc9df5e70f90" Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137561 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9dc51ef0fc54ef8c73610d3365ca5738cac69e45dfc432bfd97fab8a56b1782" Dec 03 20:09:26.137720 master-0 kubenswrapper[29252]: I1203 20:09:26.137637 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c7672f753235f31861db5762e7805d7dbeffaa2c208518211750ae8f4c45f42" Dec 03 20:09:26.139683 master-0 kubenswrapper[29252]: I1203 20:09:26.138116 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 20:09:26.139683 master-0 kubenswrapper[29252]: I1203 20:09:26.138130 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 20:09:26.139683 master-0 kubenswrapper[29252]: I1203 20:09:26.138138 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 20:09:26.145076 master-0 kubenswrapper[29252]: I1203 20:09:26.145039 29252 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 20:09:26.146852 master-0 kubenswrapper[29252]: I1203 20:09:26.146772 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 20:09:26.146852 master-0 
kubenswrapper[29252]: I1203 20:09:26.146841 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 20:09:26.146852 master-0 kubenswrapper[29252]: I1203 20:09:26.146855 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 20:09:26.147164 master-0 kubenswrapper[29252]: I1203 20:09:26.146877 29252 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 20:09:26.151468 master-0 kubenswrapper[29252]: E1203 20:09:26.151419 29252 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0" Dec 03 20:09:26.552190 master-0 kubenswrapper[29252]: I1203 20:09:26.552028 29252 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 20:09:26.554924 master-0 kubenswrapper[29252]: I1203 20:09:26.554869 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 20:09:26.554924 master-0 kubenswrapper[29252]: I1203 20:09:26.554919 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 20:09:26.554924 master-0 kubenswrapper[29252]: I1203 20:09:26.554933 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 20:09:26.555234 master-0 kubenswrapper[29252]: I1203 20:09:26.554956 29252 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 20:09:26.559156 master-0 kubenswrapper[29252]: E1203 20:09:26.559110 29252 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0" Dec 03 20:09:27.359870 
master-0 kubenswrapper[29252]: I1203 20:09:27.359701 29252 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 20:09:27.363018 master-0 kubenswrapper[29252]: I1203 20:09:27.362959 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 20:09:27.363164 master-0 kubenswrapper[29252]: I1203 20:09:27.363024 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 20:09:27.363164 master-0 kubenswrapper[29252]: I1203 20:09:27.363036 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 20:09:27.363164 master-0 kubenswrapper[29252]: I1203 20:09:27.363078 29252 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 20:09:27.367718 master-0 kubenswrapper[29252]: E1203 20:09:27.367661 29252 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0" Dec 03 20:09:28.968082 master-0 kubenswrapper[29252]: I1203 20:09:28.968029 29252 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 20:09:28.970181 master-0 kubenswrapper[29252]: I1203 20:09:28.970134 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 20:09:28.970261 master-0 kubenswrapper[29252]: I1203 20:09:28.970194 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 20:09:28.970261 master-0 kubenswrapper[29252]: I1203 20:09:28.970206 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 20:09:28.970261 master-0 kubenswrapper[29252]: I1203 20:09:28.970227 29252 
kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 20:09:29.032278 master-0 kubenswrapper[29252]: I1203 20:09:29.032152 29252 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Dec 03 20:09:29.032534 master-0 kubenswrapper[29252]: I1203 20:09:29.032372 29252 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Dec 03 20:09:29.032534 master-0 kubenswrapper[29252]: E1203 20:09:29.032402 29252 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Dec 03 20:09:29.037040 master-0 kubenswrapper[29252]: I1203 20:09:29.036992 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeNotReady" Dec 03 20:09:29.037206 master-0 kubenswrapper[29252]: I1203 20:09:29.037045 29252 setters.go:603] "Node became not ready" node="master-0" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T20:09:29Z","lastTransitionTime":"2025-12-03T20:09:29Z","reason":"KubeletNotReady","message":"CSINode is not yet initialized"} Dec 03 20:09:29.084635 master-0 kubenswrapper[29252]: E1203 20:09:29.084541 29252 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 20:09:29.185447 master-0 kubenswrapper[29252]: E1203 20:09:29.185282 29252 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 20:09:29.286516 master-0 kubenswrapper[29252]: E1203 20:09:29.286241 29252 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 20:09:29.386521 master-0 kubenswrapper[29252]: E1203 20:09:29.386444 29252 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 20:09:29.487516 master-0 kubenswrapper[29252]: E1203 20:09:29.487423 29252 kubelet_node_status.go:503] "Error 
getting the current node from lister" err="node \"master-0\" not found" Dec 03 20:09:29.588134 master-0 kubenswrapper[29252]: E1203 20:09:29.587971 29252 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 20:09:29.689197 master-0 kubenswrapper[29252]: E1203 20:09:29.689057 29252 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 20:09:29.789486 master-0 kubenswrapper[29252]: E1203 20:09:29.789369 29252 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 20:09:29.890675 master-0 kubenswrapper[29252]: E1203 20:09:29.890580 29252 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 20:09:29.991812 master-0 kubenswrapper[29252]: E1203 20:09:29.991140 29252 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 20:09:30.091635 master-0 kubenswrapper[29252]: E1203 20:09:30.091568 29252 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 20:09:30.192180 master-0 kubenswrapper[29252]: E1203 20:09:30.192055 29252 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 20:09:30.293128 master-0 kubenswrapper[29252]: E1203 20:09:30.293053 29252 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 20:09:30.358801 master-0 kubenswrapper[29252]: I1203 20:09:30.358729 29252 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 20:09:30.360193 master-0 kubenswrapper[29252]: I1203 20:09:30.360148 29252 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 20:09:30.382482 master-0 kubenswrapper[29252]: 
I1203 20:09:30.382333 29252 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 20:09:30.414166 master-0 kubenswrapper[29252]: I1203 20:09:30.414078 29252 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 03 20:09:30.418291 master-0 kubenswrapper[29252]: I1203 20:09:30.418247 29252 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 20:09:30.499381 master-0 kubenswrapper[29252]: I1203 20:09:30.499255 29252 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady" Dec 03 20:09:30.499561 master-0 kubenswrapper[29252]: I1203 20:09:30.499393 29252 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Dec 03 20:09:30.514666 master-0 kubenswrapper[29252]: I1203 20:09:30.514487 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-resource-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 20:09:30.514666 master-0 kubenswrapper[29252]: I1203 20:09:30.514548 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:30.514666 master-0 kubenswrapper[29252]: I1203 20:09:30.514572 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/dc347c4e75ec09c3a7fea6a3ba3ee63c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"dc347c4e75ec09c3a7fea6a3ba3ee63c\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:30.514666 master-0 kubenswrapper[29252]: I1203 20:09:30.514601 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 20:09:30.514666 master-0 kubenswrapper[29252]: I1203 20:09:30.514624 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-cert-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 20:09:30.515021 master-0 kubenswrapper[29252]: I1203 20:09:30.514691 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-usr-local-bin\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 20:09:30.515021 master-0 kubenswrapper[29252]: I1203 20:09:30.514741 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:30.515021 master-0 kubenswrapper[29252]: I1203 20:09:30.514762 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-manifests\") pod 
\"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:30.515021 master-0 kubenswrapper[29252]: I1203 20:09:30.514794 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:30.515021 master-0 kubenswrapper[29252]: I1203 20:09:30.514816 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-static-pod-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 20:09:30.515021 master-0 kubenswrapper[29252]: I1203 20:09:30.514833 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-data-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 20:09:30.515021 master-0 kubenswrapper[29252]: I1203 20:09:30.514851 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:30.515021 master-0 kubenswrapper[29252]: I1203 20:09:30.514869 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:30.515021 master-0 kubenswrapper[29252]: I1203 20:09:30.514883 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:30.515021 master-0 kubenswrapper[29252]: I1203 20:09:30.514901 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:30.515021 master-0 kubenswrapper[29252]: I1203 20:09:30.514916 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/dc347c4e75ec09c3a7fea6a3ba3ee63c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"dc347c4e75ec09c3a7fea6a3ba3ee63c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:30.515021 master-0 kubenswrapper[29252]: I1203 20:09:30.514930 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c46583dca69d50bb12bc004d7ee3300f-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"c46583dca69d50bb12bc004d7ee3300f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 
20:09:30.515021 master-0 kubenswrapper[29252]: I1203 20:09:30.514948 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 20:09:30.515021 master-0 kubenswrapper[29252]: I1203 20:09:30.514962 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c46583dca69d50bb12bc004d7ee3300f-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"c46583dca69d50bb12bc004d7ee3300f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 20:09:30.515021 master-0 kubenswrapper[29252]: I1203 20:09:30.514977 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-log-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 20:09:30.627921 master-0 kubenswrapper[29252]: I1203 20:09:30.627555 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 20:09:30.627921 master-0 kubenswrapper[29252]: I1203 20:09:30.627633 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-data-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 
20:09:30.627921 master-0 kubenswrapper[29252]: I1203 20:09:30.627667 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:30.627921 master-0 kubenswrapper[29252]: I1203 20:09:30.627699 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:30.627921 master-0 kubenswrapper[29252]: I1203 20:09:30.627741 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:30.627921 master-0 kubenswrapper[29252]: I1203 20:09:30.627821 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:30.627921 master-0 kubenswrapper[29252]: I1203 20:09:30.627860 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/dc347c4e75ec09c3a7fea6a3ba3ee63c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"dc347c4e75ec09c3a7fea6a3ba3ee63c\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:30.627921 master-0 kubenswrapper[29252]: I1203 20:09:30.627889 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c46583dca69d50bb12bc004d7ee3300f-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"c46583dca69d50bb12bc004d7ee3300f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 20:09:30.627921 master-0 kubenswrapper[29252]: I1203 20:09:30.627927 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c46583dca69d50bb12bc004d7ee3300f-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"c46583dca69d50bb12bc004d7ee3300f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 20:09:30.628681 master-0 kubenswrapper[29252]: I1203 20:09:30.627966 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-log-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 20:09:30.628681 master-0 kubenswrapper[29252]: I1203 20:09:30.628004 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 20:09:30.628681 master-0 kubenswrapper[29252]: I1203 20:09:30.628041 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-resource-dir\") pod \"etcd-master-0\" (UID: 
\"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 20:09:30.628681 master-0 kubenswrapper[29252]: I1203 20:09:30.628109 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:30.628681 master-0 kubenswrapper[29252]: I1203 20:09:30.628144 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/dc347c4e75ec09c3a7fea6a3ba3ee63c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"dc347c4e75ec09c3a7fea6a3ba3ee63c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:30.628681 master-0 kubenswrapper[29252]: I1203 20:09:30.628173 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-static-pod-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 20:09:30.628681 master-0 kubenswrapper[29252]: I1203 20:09:30.628204 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-cert-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 20:09:30.628681 master-0 kubenswrapper[29252]: I1203 20:09:30.628232 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-usr-local-bin\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " 
pod="openshift-etcd/etcd-master-0" Dec 03 20:09:30.628681 master-0 kubenswrapper[29252]: I1203 20:09:30.628266 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:30.628681 master-0 kubenswrapper[29252]: I1203 20:09:30.628303 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:30.628681 master-0 kubenswrapper[29252]: I1203 20:09:30.628339 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:30.628681 master-0 kubenswrapper[29252]: I1203 20:09:30.628413 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:30.628681 master-0 kubenswrapper[29252]: I1203 20:09:30.628531 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: 
\"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 20:09:30.629419 master-0 kubenswrapper[29252]: I1203 20:09:30.628736 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-data-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 20:09:30.629419 master-0 kubenswrapper[29252]: I1203 20:09:30.628868 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:30.629419 master-0 kubenswrapper[29252]: I1203 20:09:30.628943 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:30.629419 master-0 kubenswrapper[29252]: I1203 20:09:30.628999 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:30.629419 master-0 kubenswrapper[29252]: I1203 20:09:30.629055 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: 
\"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:30.629419 master-0 kubenswrapper[29252]: I1203 20:09:30.629115 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/dc347c4e75ec09c3a7fea6a3ba3ee63c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"dc347c4e75ec09c3a7fea6a3ba3ee63c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:30.629419 master-0 kubenswrapper[29252]: I1203 20:09:30.629175 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c46583dca69d50bb12bc004d7ee3300f-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"c46583dca69d50bb12bc004d7ee3300f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 20:09:30.629419 master-0 kubenswrapper[29252]: I1203 20:09:30.629231 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c46583dca69d50bb12bc004d7ee3300f-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"c46583dca69d50bb12bc004d7ee3300f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 20:09:30.629419 master-0 kubenswrapper[29252]: I1203 20:09:30.629288 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-log-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 20:09:30.629419 master-0 kubenswrapper[29252]: I1203 20:09:30.629347 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: 
\"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 20:09:30.629419 master-0 kubenswrapper[29252]: I1203 20:09:30.629404 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-resource-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 20:09:30.630423 master-0 kubenswrapper[29252]: I1203 20:09:30.629461 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:30.630423 master-0 kubenswrapper[29252]: I1203 20:09:30.629517 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/dc347c4e75ec09c3a7fea6a3ba3ee63c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"dc347c4e75ec09c3a7fea6a3ba3ee63c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:30.630423 master-0 kubenswrapper[29252]: I1203 20:09:30.629571 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-static-pod-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 20:09:30.630423 master-0 kubenswrapper[29252]: I1203 20:09:30.629624 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-cert-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" 
Dec 03 20:09:30.630423 master-0 kubenswrapper[29252]: I1203 20:09:30.629678 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-usr-local-bin\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 20:09:30.630423 master-0 kubenswrapper[29252]: I1203 20:09:30.629735 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:30.630423 master-0 kubenswrapper[29252]: I1203 20:09:30.629824 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b0f7c518a656139710b17a7667c8b898\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:30.637406 master-0 kubenswrapper[29252]: I1203 20:09:30.636750 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:30.637406 master-0 kubenswrapper[29252]: I1203 20:09:30.637153 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:30.652855 master-0 kubenswrapper[29252]: I1203 20:09:30.652060 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:30.929484 master-0 kubenswrapper[29252]: I1203 20:09:30.929431 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Dec 03 20:09:31.112911 master-0 
kubenswrapper[29252]: I1203 20:09:31.112855 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Dec 03 20:09:31.356502 master-0 kubenswrapper[29252]: I1203 20:09:31.356435 29252 apiserver.go:52] "Watching apiserver" Dec 03 20:09:31.376881 master-0 kubenswrapper[29252]: I1203 20:09:31.376737 29252 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 20:09:31.380288 master-0 kubenswrapper[29252]: I1203 20:09:31.378444 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dbfhg","openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p","openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n","openshift-ovn-kubernetes/ovnkube-node-l9m2r","openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75","openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv","openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh","openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn","openshift-machine-config-operator/machine-config-daemon-7t8bs","openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj","assisted-installer/assisted-installer-controller-ljsns","openshift-kube-apiserver/installer-1-master-0","openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw","openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt","openshift-apiserver/apiserver-b46c54696-bgb45","openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p","openshift-dns/node-resolver-hk22l","openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz","openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf","openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw","openshift-kube-apiserver/installer-1-retry-1-mast
er-0","openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj","openshift-kube-scheduler/installer-4-master-0","openshift-machine-api/machine-api-operator-7486ff55f-9p9rq","openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6","openshift-multus/multus-p9sdj","openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg","openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr","openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7","openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2","openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-h2w9j","openshift-marketplace/redhat-marketplace-wcnrx","openshift-multus/network-metrics-daemon-hs6gf","openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q","openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5","openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv","openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx","openshift-multus/multus-additional-cni-plugins-pwlw2","openshift-network-operator/iptables-alerter-72rrb","openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j","openshift-etcd/installer-2-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-kube-controller-manager/installer-2-retry-1-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0","openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx","openshift-service-ca/service-ca-6b8bb995f7-bj4vz","openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg","openshift-marketplace/community-operators-98lh5","openshift-network-node-identity/network-node-identity-r2kp
n","openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb","openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl","openshift-etcd/etcd-master-0","openshift-controller-manager/controller-manager-ff788744d-hkt6c","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-marketplace/redhat-operators-9smb5","openshift-network-operator/network-operator-6cbf58c977-w7d8t","openshift-cluster-node-tuning-operator/tuned-l789w","openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd","openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb","openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc","openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf","openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4","openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch","openshift-insights/insights-operator-59d99f9b7b-h64kt","openshift-marketplace/certified-operators-mg96g","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-kube-controller-manager/installer-2-master-0","openshift-network-diagnostics/network-check-target-x6vwd","openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb","openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5"] Dec 03 20:09:31.380288 master-0 kubenswrapper[29252]: I1203 20:09:31.378754 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-ljsns" Dec 03 20:09:31.383497 master-0 kubenswrapper[29252]: I1203 20:09:31.383447 29252 kubelet.go:2566] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="6e68aef7-c088-485c-8d5f-0a681bed67ae" Dec 03 20:09:31.384358 master-0 kubenswrapper[29252]: I1203 20:09:31.384327 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 20:09:31.384601 master-0 kubenswrapper[29252]: I1203 20:09:31.384577 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 20:09:31.384830 master-0 kubenswrapper[29252]: I1203 20:09:31.384808 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 20:09:31.384912 master-0 kubenswrapper[29252]: I1203 20:09:31.384893 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 20:09:31.385039 master-0 kubenswrapper[29252]: I1203 20:09:31.384818 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 20:09:31.393857 master-0 kubenswrapper[29252]: I1203 20:09:31.393341 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 20:09:31.402037 master-0 kubenswrapper[29252]: I1203 20:09:31.401975 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 20:09:31.402037 master-0 kubenswrapper[29252]: I1203 20:09:31.402019 29252 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 20:09:31.402213 master-0 kubenswrapper[29252]: I1203 20:09:31.402039 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 20:09:31.402213 master-0 kubenswrapper[29252]: I1203 20:09:31.402058 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 20:09:31.402213 master-0 kubenswrapper[29252]: I1203 20:09:31.402090 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Dec 03 20:09:31.402213 master-0 kubenswrapper[29252]: I1203 20:09:31.402101 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Dec 03 20:09:31.402213 master-0 kubenswrapper[29252]: I1203 20:09:31.402144 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Dec 03 20:09:31.402213 master-0 kubenswrapper[29252]: I1203 20:09:31.402151 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 20:09:31.402213 master-0 kubenswrapper[29252]: I1203 20:09:31.402169 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 20:09:31.402213 master-0 kubenswrapper[29252]: I1203 20:09:31.402192 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 20:09:31.402544 master-0 kubenswrapper[29252]: I1203 20:09:31.402058 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 20:09:31.402544 master-0 kubenswrapper[29252]: I1203 20:09:31.402056 
29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 20:09:31.402544 master-0 kubenswrapper[29252]: I1203 20:09:31.402270 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 20:09:31.402544 master-0 kubenswrapper[29252]: I1203 20:09:31.402190 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 20:09:31.402544 master-0 kubenswrapper[29252]: I1203 20:09:31.402119 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 20:09:31.402544 master-0 kubenswrapper[29252]: I1203 20:09:31.402329 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 20:09:31.402544 master-0 kubenswrapper[29252]: I1203 20:09:31.401998 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 20:09:31.402544 master-0 kubenswrapper[29252]: I1203 20:09:31.402110 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 20:09:31.402544 master-0 kubenswrapper[29252]: I1203 20:09:31.402287 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 20:09:31.402544 master-0 kubenswrapper[29252]: I1203 20:09:31.402404 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 20:09:31.402544 master-0 kubenswrapper[29252]: I1203 20:09:31.402424 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 20:09:31.402544 master-0 kubenswrapper[29252]: I1203 20:09:31.402438 
29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 20:09:31.402544 master-0 kubenswrapper[29252]: I1203 20:09:31.402456 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 20:09:31.402544 master-0 kubenswrapper[29252]: I1203 20:09:31.402466 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 03 20:09:31.402544 master-0 kubenswrapper[29252]: I1203 20:09:31.402477 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 20:09:31.402544 master-0 kubenswrapper[29252]: I1203 20:09:31.402514 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 20:09:31.402544 master-0 kubenswrapper[29252]: I1203 20:09:31.402534 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 20:09:31.402544 master-0 kubenswrapper[29252]: I1203 20:09:31.402540 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 20:09:31.403221 master-0 kubenswrapper[29252]: I1203 20:09:31.402573 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Dec 03 20:09:31.403221 master-0 kubenswrapper[29252]: I1203 20:09:31.402578 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 20:09:31.403221 master-0 kubenswrapper[29252]: I1203 20:09:31.402588 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 20:09:31.403221 
master-0 kubenswrapper[29252]: I1203 20:09:31.402620 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 20:09:31.403221 master-0 kubenswrapper[29252]: I1203 20:09:31.402311 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 20:09:31.403221 master-0 kubenswrapper[29252]: I1203 20:09:31.402637 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 20:09:31.403221 master-0 kubenswrapper[29252]: I1203 20:09:31.402678 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 20:09:31.403221 master-0 kubenswrapper[29252]: I1203 20:09:31.402688 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 03 20:09:31.403221 master-0 kubenswrapper[29252]: I1203 20:09:31.402698 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 20:09:31.403221 master-0 kubenswrapper[29252]: I1203 20:09:31.402706 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 20:09:31.403221 master-0 kubenswrapper[29252]: I1203 20:09:31.402704 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 20:09:31.403221 master-0 kubenswrapper[29252]: I1203 20:09:31.402358 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 20:09:31.403221 master-0 kubenswrapper[29252]: I1203 20:09:31.402769 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 
20:09:31.403221 master-0 kubenswrapper[29252]: I1203 20:09:31.402771 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 20:09:31.403221 master-0 kubenswrapper[29252]: I1203 20:09:31.403071 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 20:09:31.403664 master-0 kubenswrapper[29252]: I1203 20:09:31.403348 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 20:09:31.403664 master-0 kubenswrapper[29252]: I1203 20:09:31.403485 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 20:09:31.404135 master-0 kubenswrapper[29252]: I1203 20:09:31.404074 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 20:09:31.404308 master-0 kubenswrapper[29252]: I1203 20:09:31.404276 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 20:09:31.404556 master-0 kubenswrapper[29252]: I1203 20:09:31.404474 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 20:09:31.404869 master-0 kubenswrapper[29252]: I1203 20:09:31.404833 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 03 20:09:31.404946 master-0 kubenswrapper[29252]: I1203 20:09:31.404926 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 20:09:31.405080 master-0 kubenswrapper[29252]: I1203 20:09:31.405058 29252 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 20:09:31.405161 master-0 kubenswrapper[29252]: I1203 20:09:31.405062 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Dec 03 20:09:31.406297 master-0 kubenswrapper[29252]: I1203 20:09:31.406260 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 20:09:31.408294 master-0 kubenswrapper[29252]: I1203 20:09:31.408258 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 20:09:31.408765 master-0 kubenswrapper[29252]: I1203 20:09:31.408715 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 20:09:31.410298 master-0 kubenswrapper[29252]: I1203 20:09:31.409940 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 20:09:31.410298 master-0 kubenswrapper[29252]: I1203 20:09:31.410153 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Dec 03 20:09:31.410298 master-0 kubenswrapper[29252]: I1203 20:09:31.410170 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 20:09:31.410298 master-0 kubenswrapper[29252]: I1203 20:09:31.410161 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 20:09:31.410492 master-0 kubenswrapper[29252]: I1203 20:09:31.410332 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 20:09:31.410492 master-0 kubenswrapper[29252]: I1203 20:09:31.410355 29252 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 20:09:31.410492 master-0 kubenswrapper[29252]: I1203 20:09:31.410364 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 20:09:31.410492 master-0 kubenswrapper[29252]: I1203 20:09:31.410421 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Dec 03 20:09:31.410645 master-0 kubenswrapper[29252]: I1203 20:09:31.410534 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 20:09:31.410901 master-0 kubenswrapper[29252]: I1203 20:09:31.410881 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 20:09:31.411496 master-0 kubenswrapper[29252]: I1203 20:09:31.411133 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 20:09:31.412387 master-0 kubenswrapper[29252]: I1203 20:09:31.412357 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 20:09:31.412454 master-0 kubenswrapper[29252]: I1203 20:09:31.412430 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 03 20:09:31.412520 master-0 kubenswrapper[29252]: I1203 20:09:31.412476 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 20:09:31.412576 master-0 kubenswrapper[29252]: I1203 20:09:31.412545 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 20:09:31.413044 master-0 kubenswrapper[29252]: I1203 20:09:31.412819 29252 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 03 20:09:31.413044 master-0 kubenswrapper[29252]: I1203 20:09:31.412834 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 03 20:09:31.413044 master-0 kubenswrapper[29252]: I1203 20:09:31.412941 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Dec 03 20:09:31.413865 master-0 kubenswrapper[29252]: I1203 20:09:31.413837 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Dec 03 20:09:31.415072 master-0 kubenswrapper[29252]: I1203 20:09:31.414485 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Dec 03 20:09:31.415072 master-0 kubenswrapper[29252]: I1203 20:09:31.414852 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Dec 03 20:09:31.415072 master-0 kubenswrapper[29252]: I1203 20:09:31.414893 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 03 20:09:31.415072 master-0 kubenswrapper[29252]: I1203 20:09:31.414911 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Dec 03 20:09:31.415072 master-0 kubenswrapper[29252]: I1203 20:09:31.415007 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 03 20:09:31.415325 master-0 kubenswrapper[29252]: I1203 20:09:31.415253 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0"
Dec 03 20:09:31.418652 master-0 kubenswrapper[29252]: I1203 20:09:31.415448 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Dec 03 20:09:31.418652 master-0 kubenswrapper[29252]: I1203 20:09:31.417652 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 03 20:09:31.418652 master-0 kubenswrapper[29252]: I1203 20:09:31.417757 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0"
Dec 03 20:09:31.418652 master-0 kubenswrapper[29252]: I1203 20:09:31.418102 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Dec 03 20:09:31.418652 master-0 kubenswrapper[29252]: I1203 20:09:31.418282 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 03 20:09:31.419129 master-0 kubenswrapper[29252]: I1203 20:09:31.419104 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 03 20:09:31.419129 master-0 kubenswrapper[29252]: I1203 20:09:31.419106 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 03 20:09:31.419380 master-0 kubenswrapper[29252]: I1203 20:09:31.419346 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 03 20:09:31.419426 master-0 kubenswrapper[29252]: I1203 20:09:31.419387 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 03 20:09:31.419521 master-0 kubenswrapper[29252]: I1203 20:09:31.419494 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 03 20:09:31.419686 master-0 kubenswrapper[29252]: I1203 20:09:31.419673 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 03 20:09:31.419967 master-0 kubenswrapper[29252]: I1203 20:09:31.419947 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 03 20:09:31.420043 master-0 kubenswrapper[29252]: I1203 20:09:31.419968 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 03 20:09:31.420043 master-0 kubenswrapper[29252]: I1203 20:09:31.420022 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 03 20:09:31.420137 master-0 kubenswrapper[29252]: I1203 20:09:31.420063 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 03 20:09:31.420190 master-0 kubenswrapper[29252]: I1203 20:09:31.420181 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 03 20:09:31.420264 master-0 kubenswrapper[29252]: I1203 20:09:31.420236 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 03 20:09:31.422835 master-0 kubenswrapper[29252]: I1203 20:09:31.422811 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0"
Dec 03 20:09:31.423354 master-0 kubenswrapper[29252]: I1203 20:09:31.423322 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Dec 03 20:09:31.429035 master-0 kubenswrapper[29252]: I1203 20:09:31.429000 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 03 20:09:31.429588 master-0 kubenswrapper[29252]: I1203 20:09:31.429539 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Dec 03 20:09:31.431838 master-0 kubenswrapper[29252]: I1203 20:09:31.431797 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-config\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5"
Dec 03 20:09:31.431920 master-0 kubenswrapper[29252]: I1203 20:09:31.431847 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfqnq\" (UniqueName: \"kubernetes.io/projected/11e2c94f-f9e9-415b-a550-3006a4632ba4-kube-api-access-pfqnq\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5"
Dec 03 20:09:31.431920 master-0 kubenswrapper[29252]: I1203 20:09:31.431878 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-855t4\" (UniqueName: \"kubernetes.io/projected/ba68608f-6b36-455e-b80b-d19237df9312-kube-api-access-855t4\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"
Dec 03 20:09:31.431920 master-0 kubenswrapper[29252]: I1203 20:09:31.431908 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crfnp\" (UniqueName: \"kubernetes.io/projected/0d4e4f88-7106-4a46-8b63-053345922fb0-kube-api-access-crfnp\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q"
Dec 03 20:09:31.432041 master-0 kubenswrapper[29252]: I1203 20:09:31.431935 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-serving-cert\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 20:09:31.432041 master-0 kubenswrapper[29252]: I1203 20:09:31.431982 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"
Dec 03 20:09:31.432041 master-0 kubenswrapper[29252]: I1203 20:09:31.432012 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-serving-cert\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5"
Dec 03 20:09:31.432160 master-0 kubenswrapper[29252]: I1203 20:09:31.432038 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sqtm\" (UniqueName: \"kubernetes.io/projected/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-kube-api-access-6sqtm\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5"
Dec 03 20:09:31.432160 master-0 kubenswrapper[29252]: I1203 20:09:31.432062 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f9f99422-7991-40ef-92a1-de2e603e47b9-operand-assets\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl"
Dec 03 20:09:31.432160 master-0 kubenswrapper[29252]: I1203 20:09:31.432089 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-config\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5"
Dec 03 20:09:31.432160 master-0 kubenswrapper[29252]: I1203 20:09:31.432105 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q"
Dec 03 20:09:31.432160 master-0 kubenswrapper[29252]: I1203 20:09:31.432126 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 20:09:31.432160 master-0 kubenswrapper[29252]: I1203 20:09:31.432146 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-service-ca-bundle\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 20:09:31.432404 master-0 kubenswrapper[29252]: I1203 20:09:31.432164 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e2c94f-f9e9-415b-a550-3006a4632ba4-config\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5"
Dec 03 20:09:31.432404 master-0 kubenswrapper[29252]: I1203 20:09:31.432182 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-serving-cert\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt"
Dec 03 20:09:31.432404 master-0 kubenswrapper[29252]: I1203 20:09:31.432202 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6eb4700c-6af0-468b-afc8-1e09b902d6bf-metrics-tls\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t"
Dec 03 20:09:31.432404 master-0 kubenswrapper[29252]: I1203 20:09:31.432219 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7nkb\" (UniqueName: \"kubernetes.io/projected/6eb4700c-6af0-468b-afc8-1e09b902d6bf-kube-api-access-w7nkb\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t"
Dec 03 20:09:31.432404 master-0 kubenswrapper[29252]: I1203 20:09:31.432238 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-config\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 20:09:31.432404 master-0 kubenswrapper[29252]: I1203 20:09:31.432257 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxqph\" (UniqueName: \"kubernetes.io/projected/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-kube-api-access-sxqph\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 20:09:31.432404 master-0 kubenswrapper[29252]: I1203 20:09:31.432278 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-config\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt"
Dec 03 20:09:31.432404 master-0 kubenswrapper[29252]: I1203 20:09:31.432298 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/6eb4700c-6af0-468b-afc8-1e09b902d6bf-host-etc-kube\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t"
Dec 03 20:09:31.432404 master-0 kubenswrapper[29252]: I1203 20:09:31.432319 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9f99422-7991-40ef-92a1-de2e603e47b9-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl"
Dec 03 20:09:31.432404 master-0 kubenswrapper[29252]: I1203 20:09:31.432335 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-457ln\" (UniqueName: \"kubernetes.io/projected/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-kube-api-access-457ln\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt"
Dec 03 20:09:31.432404 master-0 kubenswrapper[29252]: I1203 20:09:31.432372 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk4z4\" (UniqueName: \"kubernetes.io/projected/f9f99422-7991-40ef-92a1-de2e603e47b9-kube-api-access-pk4z4\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl"
Dec 03 20:09:31.432404 master-0 kubenswrapper[29252]: I1203 20:09:31.432387 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ba68608f-6b36-455e-b80b-d19237df9312-telemetry-config\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"
Dec 03 20:09:31.432404 master-0 kubenswrapper[29252]: I1203 20:09:31.432406 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11e2c94f-f9e9-415b-a550-3006a4632ba4-serving-cert\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5"
Dec 03 20:09:31.432932 master-0 kubenswrapper[29252]: I1203 20:09:31.432421 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-profile-collector-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 20:09:31.432932 master-0 kubenswrapper[29252]: I1203 20:09:31.432436 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ghnf\" (UniqueName: \"kubernetes.io/projected/a19b8f9e-6299-43bf-9aa5-22071b855773-kube-api-access-6ghnf\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 20:09:31.432932 master-0 kubenswrapper[29252]: I1203 20:09:31.432452 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-trusted-ca-bundle\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 20:09:31.432932 master-0 kubenswrapper[29252]: I1203 20:09:31.432457 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-serving-cert\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5"
Dec 03 20:09:31.432932 master-0 kubenswrapper[29252]: I1203 20:09:31.432480 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba68608f-6b36-455e-b80b-d19237df9312-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"
Dec 03 20:09:31.432932 master-0 kubenswrapper[29252]: I1203 20:09:31.432689 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-serving-cert\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 20:09:31.432932 master-0 kubenswrapper[29252]: I1203 20:09:31.432848 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e2c94f-f9e9-415b-a550-3006a4632ba4-config\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5"
Dec 03 20:09:31.433136 master-0 kubenswrapper[29252]: I1203 20:09:31.432791 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-config\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt"
Dec 03 20:09:31.433136 master-0 kubenswrapper[29252]: I1203 20:09:31.433014 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e4f88-7106-4a46-8b63-053345922fb0-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q"
Dec 03 20:09:31.433136 master-0 kubenswrapper[29252]: I1203 20:09:31.433024 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-trusted-ca-bundle\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 20:09:31.433136 master-0 kubenswrapper[29252]: I1203 20:09:31.433065 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9f99422-7991-40ef-92a1-de2e603e47b9-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl"
Dec 03 20:09:31.433248 master-0 kubenswrapper[29252]: I1203 20:09:31.433189 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-srv-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 20:09:31.433248 master-0 kubenswrapper[29252]: I1203 20:09:31.433199 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ba68608f-6b36-455e-b80b-d19237df9312-telemetry-config\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv"
Dec 03 20:09:31.433248 master-0 kubenswrapper[29252]: I1203 20:09:31.433223 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6eb4700c-6af0-468b-afc8-1e09b902d6bf-metrics-tls\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t"
Dec 03 20:09:31.433334 master-0 kubenswrapper[29252]: I1203 20:09:31.433318 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-service-ca-bundle\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 20:09:31.433365 master-0 kubenswrapper[29252]: I1203 20:09:31.433350 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a19b8f9e-6299-43bf-9aa5-22071b855773-profile-collector-cert\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 20:09:31.433399 master-0 kubenswrapper[29252]: I1203 20:09:31.433388 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-serving-cert\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt"
Dec 03 20:09:31.433479 master-0 kubenswrapper[29252]: I1203 20:09:31.433450 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f9f99422-7991-40ef-92a1-de2e603e47b9-operand-assets\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl"
Dec 03 20:09:31.433523 master-0 kubenswrapper[29252]: I1203 20:09:31.433453 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-config\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz"
Dec 03 20:09:31.433523 master-0 kubenswrapper[29252]: I1203 20:09:31.433488 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11e2c94f-f9e9-415b-a550-3006a4632ba4-serving-cert\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5"
Dec 03 20:09:31.434598 master-0 kubenswrapper[29252]: I1203 20:09:31.434552 29252 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID=""
Dec 03 20:09:31.441039 master-0 kubenswrapper[29252]: I1203 20:09:31.441011 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 03 20:09:31.461864 master-0 kubenswrapper[29252]: I1203 20:09:31.461811 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 03 20:09:31.480789 master-0 kubenswrapper[29252]: I1203 20:09:31.480726 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 03 20:09:31.492996 master-0 kubenswrapper[29252]: I1203 20:09:31.492954 29252 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Dec 03 20:09:31.501968 master-0 kubenswrapper[29252]: I1203 20:09:31.501923 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 03 20:09:31.521208 master-0 kubenswrapper[29252]: I1203 20:09:31.521180 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 03 20:09:31.534067 master-0 kubenswrapper[29252]: I1203 20:09:31.534008 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-config\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj"
Dec 03 20:09:31.534067 master-0 kubenswrapper[29252]: I1203 20:09:31.534064 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-sysctl-d\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w"
Dec 03 20:09:31.534169 master-0 kubenswrapper[29252]: I1203 20:09:31.534088 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-available-featuregates\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv"
Dec 03 20:09:31.534169 master-0 kubenswrapper[29252]: I1203 20:09:31.534107 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d468\" (UniqueName: \"kubernetes.io/projected/6404bbc7-8ca9-4f20-8ce7-40f855555160-kube-api-access-4d468\") pod \"cloud-credential-operator-7c4dc67499-lqdlr\" (UID: \"6404bbc7-8ca9-4f20-8ce7-40f855555160\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr"
Dec 03 20:09:31.534169 master-0 kubenswrapper[29252]: I1203 20:09:31.534123 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1f82c7a1-ec21-497d-86f2-562cafa7ace7-ca-certs\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j"
Dec 03 20:09:31.534169 master-0 kubenswrapper[29252]: I1203 20:09:31.534139 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f96c70ce-314a-4919-91e9-cc776a620846-trusted-ca-bundle\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb"
Dec 03 20:09:31.534169 master-0 kubenswrapper[29252]: I1203 20:09:31.534154 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvlxr\" (UniqueName: \"kubernetes.io/projected/56e013ee-ea7a-4780-8986-a7fd1b5a3a3f-kube-api-access-vvlxr\") pod \"node-resolver-hk22l\" (UID: \"56e013ee-ea7a-4780-8986-a7fd1b5a3a3f\") " pod="openshift-dns/node-resolver-hk22l"
Dec 03 20:09:31.534169 master-0 kubenswrapper[29252]: I1203 20:09:31.534172 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d210062f-c07e-419f-a551-c37571565686-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg"
Dec 03 20:09:31.534345 master-0 kubenswrapper[29252]: I1203 20:09:31.534190 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/433c3273-c99e-4d68-befc-06f92d2fc8d5-images\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb"
Dec 03 20:09:31.534345 master-0 kubenswrapper[29252]: I1203 20:09:31.534215 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-node-log\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 20:09:31.534345 master-0 kubenswrapper[29252]: I1203 20:09:31.534230 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-serving-cert\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv"
Dec 03 20:09:31.534345 master-0 kubenswrapper[29252]: I1203 20:09:31.534247 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljsr6\" (UniqueName: \"kubernetes.io/projected/c3afc439-ccaa-4751-95a1-ac7557e326f0-kube-api-access-ljsr6\") pod \"multus-admission-controller-5bdcc987c4-s6wpc\" (UID: \"c3afc439-ccaa-4751-95a1-ac7557e326f0\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc"
Dec 03 20:09:31.534345 master-0 kubenswrapper[29252]: I1203 20:09:31.534264 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-whereabouts-configmap\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2"
Dec 03 20:09:31.534345 master-0 kubenswrapper[29252]: I1203 20:09:31.534278 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-lib-modules\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w"
Dec 03 20:09:31.534345 master-0 kubenswrapper[29252]: I1203 20:09:31.534296 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-available-featuregates\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv"
Dec 03 20:09:31.534345 master-0 kubenswrapper[29252]: I1203 20:09:31.534320 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad22d8ed-2476-441b-aa3b-a7845606b0ac-machine-api-operator-tls\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq"
Dec 03 20:09:31.534561 master-0 kubenswrapper[29252]: I1203 20:09:31.534374 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/73b7027e-44f5-4c7b-9226-585a90530535-etc-containers\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf"
Dec 03 20:09:31.534561 master-0 kubenswrapper[29252]: I1203 20:09:31.534407 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs8fx\" (UniqueName: \"kubernetes.io/projected/d7171597-cb9a-451c-80a4-64cfccf885f0-kube-api-access-gs8fx\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w"
Dec 03 20:09:31.534561 master-0 kubenswrapper[29252]: I1203 20:09:31.534412 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-config\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj"
Dec 03 20:09:31.534561 master-0 kubenswrapper[29252]: I1203 20:09:31.534431 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c52974d8-fbe6-444b-97ae-468482eebac8-serving-cert\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:09:31.534561 master-0 kubenswrapper[29252]: I1203 20:09:31.534451 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-conf-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 20:09:31.534561 master-0 kubenswrapper[29252]: I1203 20:09:31.534469 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-webhook-cert\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn"
Dec 03 20:09:31.534561 master-0 kubenswrapper[29252]: I1203 20:09:31.534487 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/90610a53-b590-491e-8014-f0704afdc6e1-images\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75"
Dec 03 20:09:31.534561 master-0 kubenswrapper[29252]: I1203 20:09:31.534507 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjn9m\" (UniqueName: \"kubernetes.io/projected/ad22d8ed-2476-441b-aa3b-a7845606b0ac-kube-api-access-xjn9m\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq"
Dec 03 20:09:31.534561 master-0 kubenswrapper[29252]: I1203 20:09:31.534525 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bhk4\" (UniqueName: \"kubernetes.io/projected/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-kube-api-access-6bhk4\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj"
Dec 03 20:09:31.534561 master-0 kubenswrapper[29252]: I1203 20:09:31.534540 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-etc-kubernetes\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj"
Dec 03 20:09:31.534561 master-0 kubenswrapper[29252]: I1203 20:09:31.534557 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5decce88-c71e-411c-87b5-a37dd0f77e7b-trusted-ca\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p"
Dec 03 20:09:31.534884 master-0 kubenswrapper[29252]: I1203 20:09:31.534574 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/56e013ee-ea7a-4780-8986-a7fd1b5a3a3f-hosts-file\") pod \"node-resolver-hk22l\" (UID: \"56e013ee-ea7a-4780-8986-a7fd1b5a3a3f\") " pod="openshift-dns/node-resolver-hk22l"
Dec 03 20:09:31.534884 master-0 kubenswrapper[29252]: I1203 20:09:31.534593 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-trusted-ca\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6"
Dec 03 20:09:31.534884 master-0 kubenswrapper[29252]: I1203 20:09:31.534628 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-rootfs\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " 
pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 20:09:31.534884 master-0 kubenswrapper[29252]: I1203 20:09:31.534646 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b638f207-31df-4298-8801-4da6031deefc-catalog-content\") pod \"redhat-marketplace-wcnrx\" (UID: \"b638f207-31df-4298-8801-4da6031deefc\") " pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:09:31.534884 master-0 kubenswrapper[29252]: I1203 20:09:31.534668 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7t26\" (UniqueName: \"kubernetes.io/projected/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-kube-api-access-k7t26\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 03 20:09:31.534884 master-0 kubenswrapper[29252]: I1203 20:09:31.534687 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-serving-cert\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" Dec 03 20:09:31.534884 master-0 kubenswrapper[29252]: I1203 20:09:31.534702 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-k8s-cni-cncf-io\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.534884 master-0 kubenswrapper[29252]: I1203 20:09:31.534719 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdd6z\" 
(UniqueName: \"kubernetes.io/projected/af2023e1-9c7a-40af-a6bf-fba31c3565b1-kube-api-access-hdd6z\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" Dec 03 20:09:31.534884 master-0 kubenswrapper[29252]: I1203 20:09:31.534729 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-serving-cert\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 20:09:31.534884 master-0 kubenswrapper[29252]: I1203 20:09:31.534736 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bb19329-c50c-4214-94c8-7e8771b99233-catalog-content\") pod \"certified-operators-mg96g\" (UID: \"6bb19329-c50c-4214-94c8-7e8771b99233\") " pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:09:31.534884 master-0 kubenswrapper[29252]: I1203 20:09:31.534757 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8709c6c-8729-4702-a3fb-35a072855096-kube-api-access\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 20:09:31.534884 master-0 kubenswrapper[29252]: I1203 20:09:31.534803 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-system-cni-dir\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 
20:09:31.534884 master-0 kubenswrapper[29252]: I1203 20:09:31.534845 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-run-ovn-kubernetes\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.534884 master-0 kubenswrapper[29252]: I1203 20:09:31.534865 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-apiservice-cert\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 03 20:09:31.534884 master-0 kubenswrapper[29252]: I1203 20:09:31.534882 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-cni-binary-copy\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.534901 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-hostroot\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.534919 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-cni-bin\") pod \"ovnkube-node-l9m2r\" (UID: 
\"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.534917 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d210062f-c07e-419f-a551-c37571565686-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.534940 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kszjr\" (UniqueName: \"kubernetes.io/projected/6bb19329-c50c-4214-94c8-7e8771b99233-kube-api-access-kszjr\") pod \"certified-operators-mg96g\" (UID: \"6bb19329-c50c-4214-94c8-7e8771b99233\") " pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.534960 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-encryption-config\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.534966 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-whereabouts-configmap\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.535001 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bztz2\" 
(UniqueName: \"kubernetes.io/projected/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-kube-api-access-bztz2\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.535035 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.535051 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2-catalog-content\") pod \"community-operators-98lh5\" (UID: \"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2\") " pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.535068 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f96c70ce-314a-4919-91e9-cc776a620846-audit-policies\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.535087 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/943feb0d-7d31-446a-9100-dfc4ef013d12-serving-cert\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.535094 29252 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b638f207-31df-4298-8801-4da6031deefc-catalog-content\") pod \"redhat-marketplace-wcnrx\" (UID: \"b638f207-31df-4298-8801-4da6031deefc\") " pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.535105 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-env-overrides\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.535127 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trv6b\" (UniqueName: \"kubernetes.io/projected/b638f207-31df-4298-8801-4da6031deefc-kube-api-access-trv6b\") pod \"redhat-marketplace-wcnrx\" (UID: \"b638f207-31df-4298-8801-4da6031deefc\") " pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.535144 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/73b7027e-44f5-4c7b-9226-585a90530535-cache\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.535161 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-profile-collector-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.535184 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-netns\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.535201 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cnmn\" (UniqueName: \"kubernetes.io/projected/1c22cb59-5083-4be6-9998-a9e67a2c20cd-kube-api-access-7cnmn\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.535219 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4v7k\" (UniqueName: \"kubernetes.io/projected/371917da-b783-4acc-81af-1cfc903269f4-kube-api-access-w4v7k\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.535246 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af2023e1-9c7a-40af-a6bf-fba31c3565b1-serving-cert\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.535264 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-config\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.535280 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-trusted-ca-bundle\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.535286 master-0 kubenswrapper[29252]: I1203 20:09:31.535297 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d210062f-c07e-419f-a551-c37571565686-env-overrides\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535310 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-serving-cert\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535325 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qrgh\" (UniqueName: \"kubernetes.io/projected/128ed384-7ab6-41b6-bf45-c8fda917d52f-kube-api-access-7qrgh\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" Dec 03 20:09:31.535901 
master-0 kubenswrapper[29252]: I1203 20:09:31.535345 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c593a75e-c2af-4419-94da-e0c9ff14c41f-node-pullsecrets\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535361 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535377 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-env-overrides\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535396 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbsx8\" (UniqueName: \"kubernetes.io/projected/daa8efc0-4514-4a14-80f5-ab9eca53a127-kube-api-access-rbsx8\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535411 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bb19329-c50c-4214-94c8-7e8771b99233-catalog-content\") pod 
\"certified-operators-mg96g\" (UID: \"6bb19329-c50c-4214-94c8-7e8771b99233\") " pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535414 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a82ff78-4383-4ca8-8a72-98c2ee50ffe2-samples-operator-tls\") pod \"cluster-samples-operator-6d64b47964-h9nkv\" (UID: \"6a82ff78-4383-4ca8-8a72-98c2ee50ffe2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535445 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7xk9\" (UniqueName: \"kubernetes.io/projected/d210062f-c07e-419f-a551-c37571565686-kube-api-access-v7xk9\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535465 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-ovnkube-identity-cm\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535482 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-tmpfs\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535501 29252 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-ca\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535518 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sdw4\" (UniqueName: \"kubernetes.io/projected/d5f33153-bff1-403f-ae17-b7e90500365d-kube-api-access-5sdw4\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535534 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-proxy-tls\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535537 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5decce88-c71e-411c-87b5-a37dd0f77e7b-trusted-ca\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535552 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-cni-bin\") pod \"multus-p9sdj\" (UID: 
\"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535571 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-cni-multus\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535590 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-sysctl-conf\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535598 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-webhook-cert\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535609 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb5j7\" (UniqueName: \"kubernetes.io/projected/367c2c7c-1fc8-4608-aa94-b64c6c70cc61-kube-api-access-hb5j7\") pod \"csi-snapshot-controller-86897dd478-s29k7\" (UID: \"367c2c7c-1fc8-4608-aa94-b64c6c70cc61\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535681 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/63e3d36d-1676-4f90-ac9a-d85b861a4655-signing-cabundle\") pod \"service-ca-6b8bb995f7-bj4vz\" (UID: \"63e3d36d-1676-4f90-ac9a-d85b861a4655\") " pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535714 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c9qq\" (UniqueName: \"kubernetes.io/projected/2f618ea7-3ad7-4dce-b450-a8202285f312-kube-api-access-4c9qq\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535734 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-cni-binary-copy\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 20:09:31.535901 master-0 kubenswrapper[29252]: I1203 20:09:31.535763 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-tmpfs\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 03 20:09:31.536571 master-0 kubenswrapper[29252]: I1203 20:09:31.535930 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-trusted-ca\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 20:09:31.536571 master-0 kubenswrapper[29252]: I1203 20:09:31.535953 29252 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-ca\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 20:09:31.536571 master-0 kubenswrapper[29252]: I1203 20:09:31.535993 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/63e3d36d-1676-4f90-ac9a-d85b861a4655-signing-cabundle\") pod \"service-ca-6b8bb995f7-bj4vz\" (UID: \"63e3d36d-1676-4f90-ac9a-d85b861a4655\") " pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" Dec 03 20:09:31.536571 master-0 kubenswrapper[29252]: I1203 20:09:31.536244 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-profile-collector-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 20:09:31.536571 master-0 kubenswrapper[29252]: I1203 20:09:31.536305 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-env-overrides\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.536571 master-0 kubenswrapper[29252]: I1203 20:09:31.535738 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/433c3273-c99e-4d68-befc-06f92d2fc8d5-config\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" Dec 03 
20:09:31.536571 master-0 kubenswrapper[29252]: I1203 20:09:31.536362 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-bound-sa-token\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 20:09:31.536571 master-0 kubenswrapper[29252]: I1203 20:09:31.536385 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-config\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj" Dec 03 20:09:31.536571 master-0 kubenswrapper[29252]: I1203 20:09:31.536397 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-metrics-tls\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 20:09:31.536571 master-0 kubenswrapper[29252]: I1203 20:09:31.536402 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvllg\" (UniqueName: \"kubernetes.io/projected/87f1759a-7df4-442e-a22d-6de8d54be333-kube-api-access-wvllg\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 20:09:31.536571 master-0 kubenswrapper[29252]: I1203 20:09:31.536433 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-config\") pod 
\"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" Dec 03 20:09:31.536571 master-0 kubenswrapper[29252]: I1203 20:09:31.536453 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-image-import-ca\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.536571 master-0 kubenswrapper[29252]: I1203 20:09:31.536471 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/433c3273-c99e-4d68-befc-06f92d2fc8d5-cert\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" Dec 03 20:09:31.536571 master-0 kubenswrapper[29252]: I1203 20:09:31.536509 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d210062f-c07e-419f-a551-c37571565686-env-overrides\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 20:09:31.536571 master-0 kubenswrapper[29252]: I1203 20:09:31.536524 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2-catalog-content\") pod \"community-operators-98lh5\" (UID: \"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2\") " pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:09:31.537014 master-0 kubenswrapper[29252]: I1203 20:09:31.536630 29252 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-serving-cert\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" Dec 03 20:09:31.537014 master-0 kubenswrapper[29252]: I1203 20:09:31.536656 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-system-cni-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.537014 master-0 kubenswrapper[29252]: I1203 20:09:31.536672 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-sys\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.537014 master-0 kubenswrapper[29252]: I1203 20:09:31.536692 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-etcd-client\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.537014 master-0 kubenswrapper[29252]: I1203 20:09:31.536709 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-images\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx" Dec 03 20:09:31.537014 master-0 
kubenswrapper[29252]: I1203 20:09:31.536726 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-trusted-ca\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 20:09:31.537014 master-0 kubenswrapper[29252]: I1203 20:09:31.536741 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/73b7027e-44f5-4c7b-9226-585a90530535-cache\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 20:09:31.537014 master-0 kubenswrapper[29252]: I1203 20:09:31.536824 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/943feb0d-7d31-446a-9100-dfc4ef013d12-config\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" Dec 03 20:09:31.537014 master-0 kubenswrapper[29252]: I1203 20:09:31.536867 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-systemd\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.537014 master-0 kubenswrapper[29252]: I1203 20:09:31.536894 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert\") pod 
\"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 20:09:31.537014 master-0 kubenswrapper[29252]: I1203 20:09:31.536918 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-ovn\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.537014 master-0 kubenswrapper[29252]: I1203 20:09:31.536938 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-cni-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.537014 master-0 kubenswrapper[29252]: I1203 20:09:31.536951 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/943feb0d-7d31-446a-9100-dfc4ef013d12-serving-cert\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" Dec 03 20:09:31.537014 master-0 kubenswrapper[29252]: I1203 20:09:31.536962 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6skg\" (UniqueName: \"kubernetes.io/projected/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-kube-api-access-j6skg\") pod \"cluster-autoscaler-operator-7f88444875-kqfs4\" (UID: \"b2021db5-b27a-4e06-beec-d9ba82aa1ffc\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537025 29252 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-client\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537068 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-var-lib-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537109 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2xcx\" (UniqueName: \"kubernetes.io/projected/c593a75e-c2af-4419-94da-e0c9ff14c41f-kube-api-access-j2xcx\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537146 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-ovnkube-config\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537236 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-config\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: 
I1203 20:09:31.537275 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/73b7027e-44f5-4c7b-9226-585a90530535-etc-docker\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537310 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk5wb\" (UniqueName: \"kubernetes.io/projected/cd35fc5f-07ab-4c66-9b80-33a598d417ef-kube-api-access-qk5wb\") pod \"control-plane-machine-set-operator-66f4cc99d4-2llfg\" (UID: \"cd35fc5f-07ab-4c66-9b80-33a598d417ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537322 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-trusted-ca\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537339 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6404bbc7-8ca9-4f20-8ce7-40f855555160-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-7c4dc67499-lqdlr\" (UID: \"6404bbc7-8ca9-4f20-8ce7-40f855555160\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537366 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-modprobe-d\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537389 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1f82c7a1-ec21-497d-86f2-562cafa7ace7-cache\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537416 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/09f5df5c-fd9b-430d-aecc-242594b4aff1-auth-proxy-config\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537439 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537463 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtwbs\" (UniqueName: \"kubernetes.io/projected/b84835e3-e8bc-4aa4-a8f3-f9be702a358a-kube-api-access-vtwbs\") pod \"csi-snapshot-controller-operator-7b795784b8-4gppw\" (UID: \"b84835e3-e8bc-4aa4-a8f3-f9be702a358a\") " 
pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537488 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grk2s\" (UniqueName: \"kubernetes.io/projected/2d43df9b-bb29-4581-8cd9-f3b9c0c0e4d9-kube-api-access-grk2s\") pod \"migrator-5bcf58cf9c-h2w9j\" (UID: \"2d43df9b-bb29-4581-8cd9-f3b9c0c0e4d9\") " pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-h2w9j" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537516 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537466 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-env-overrides\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537616 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-serving-cert\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537654 29252 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tphq2\" (UniqueName: \"kubernetes.io/projected/d196dca7-f940-4aa0-b20a-214d22b62db6-kube-api-access-tphq2\") pod \"dns-default-dbfhg\" (UID: \"d196dca7-f940-4aa0-b20a-214d22b62db6\") " pod="openshift-dns/dns-default-dbfhg" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537681 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537717 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-os-release\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537737 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-kubernetes\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537761 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c22cb59-5083-4be6-9998-a9e67a2c20cd-serving-cert\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:09:31.537971 master-0 
kubenswrapper[29252]: I1203 20:09:31.537794 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-config\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537814 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b638f207-31df-4298-8801-4da6031deefc-utilities\") pod \"redhat-marketplace-wcnrx\" (UID: \"b638f207-31df-4298-8801-4da6031deefc\") " pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537845 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd35fc5f-07ab-4c66-9b80-33a598d417ef-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-66f4cc99d4-2llfg\" (UID: \"cd35fc5f-07ab-4c66-9b80-33a598d417ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537872 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twlw5\" (UniqueName: \"kubernetes.io/projected/09f5df5c-fd9b-430d-aecc-242594b4aff1-kube-api-access-twlw5\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537897 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7vxl\" (UniqueName: 
\"kubernetes.io/projected/c52974d8-fbe6-444b-97ae-468482eebac8-kube-api-access-p7vxl\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537921 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a710102c-72fb-4d8d-ad99-71940368a09e-utilities\") pod \"redhat-operators-9smb5\" (UID: \"a710102c-72fb-4d8d-ad99-71940368a09e\") " pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:09:31.537971 master-0 kubenswrapper[29252]: I1203 20:09:31.537943 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwcj7\" (UniqueName: \"kubernetes.io/projected/433c3273-c99e-4d68-befc-06f92d2fc8d5-kube-api-access-xwcj7\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.537989 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/90610a53-b590-491e-8014-f0704afdc6e1-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538016 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl5h7\" (UniqueName: \"kubernetes.io/projected/6a82ff78-4383-4ca8-8a72-98c2ee50ffe2-kube-api-access-dl5h7\") pod \"cluster-samples-operator-6d64b47964-h9nkv\" (UID: 
\"6a82ff78-4383-4ca8-8a72-98c2ee50ffe2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538040 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-webhook-cert\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538068 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-os-release\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538092 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538119 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1f82c7a1-ec21-497d-86f2-562cafa7ace7-catalogserver-certs\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538122 29252 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d5f33153-bff1-403f-ae17-b7e90500365d-srv-cert\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538145 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2-utilities\") pod \"community-operators-98lh5\" (UID: \"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2\") " pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538168 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538173 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538191 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-ovnkube-script-lib\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.538741 
master-0 kubenswrapper[29252]: I1203 20:09:31.538204 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7ed25861-1328-45e7-922e-37588a0b019c-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538217 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/daa8efc0-4514-4a14-80f5-ab9eca53a127-serving-cert\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.537956 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/943feb0d-7d31-446a-9100-dfc4ef013d12-config\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538253 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-client\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538258 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1f82c7a1-ec21-497d-86f2-562cafa7ace7-cache\") pod 
\"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538306 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b638f207-31df-4298-8801-4da6031deefc-utilities\") pod \"redhat-marketplace-wcnrx\" (UID: \"b638f207-31df-4298-8801-4da6031deefc\") " pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538308 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-ovnkube-config\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538311 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-sysconfig\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538354 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a710102c-72fb-4d8d-ad99-71940368a09e-utilities\") pod \"redhat-operators-9smb5\" (UID: \"a710102c-72fb-4d8d-ad99-71940368a09e\") " pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538373 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/90610a53-b590-491e-8014-f0704afdc6e1-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538507 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-systemd-units\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538540 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d196dca7-f940-4aa0-b20a-214d22b62db6-config-volume\") pod \"dns-default-dbfhg\" (UID: \"d196dca7-f940-4aa0-b20a-214d22b62db6\") " pod="openshift-dns/dns-default-dbfhg" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538585 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2-utilities\") pod \"community-operators-98lh5\" (UID: \"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2\") " pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538623 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a710102c-72fb-4d8d-ad99-71940368a09e-catalog-content\") pod \"redhat-operators-9smb5\" (UID: \"a710102c-72fb-4d8d-ad99-71940368a09e\") " pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538653 29252 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-kubelet\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538675 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a710102c-72fb-4d8d-ad99-71940368a09e-catalog-content\") pod \"redhat-operators-9smb5\" (UID: \"a710102c-72fb-4d8d-ad99-71940368a09e\") " pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538700 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv24n\" (UniqueName: \"kubernetes.io/projected/7ed25861-1328-45e7-922e-37588a0b019c-kube-api-access-cv24n\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538699 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/daa8efc0-4514-4a14-80f5-ab9eca53a127-serving-cert\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" Dec 03 20:09:31.538741 master-0 kubenswrapper[29252]: I1203 20:09:31.538714 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2f618ea7-3ad7-4dce-b450-a8202285f312-ovnkube-script-lib\") pod \"ovnkube-node-l9m2r\" (UID: 
\"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.539503 master-0 kubenswrapper[29252]: I1203 20:09:31.538792 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5decce88-c71e-411c-87b5-a37dd0f77e7b-bound-sa-token\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 20:09:31.539503 master-0 kubenswrapper[29252]: I1203 20:09:31.538816 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d196dca7-f940-4aa0-b20a-214d22b62db6-metrics-tls\") pod \"dns-default-dbfhg\" (UID: \"d196dca7-f940-4aa0-b20a-214d22b62db6\") " pod="openshift-dns/dns-default-dbfhg" Dec 03 20:09:31.539503 master-0 kubenswrapper[29252]: I1203 20:09:31.538838 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-multus-certs\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.539503 master-0 kubenswrapper[29252]: I1203 20:09:31.538859 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2c85\" (UniqueName: \"kubernetes.io/projected/46b5d4d0-b841-4e87-84b4-85911ff04325-kube-api-access-s2c85\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 20:09:31.539503 master-0 kubenswrapper[29252]: I1203 20:09:31.538931 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f96c70ce-314a-4919-91e9-cc776a620846-audit-dir\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 20:09:31.539503 master-0 kubenswrapper[29252]: I1203 20:09:31.538951 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2qch\" (UniqueName: \"kubernetes.io/projected/b673cb04-f6f0-4113-bdcd-d6685b942c9f-kube-api-access-m2qch\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 20:09:31.539503 master-0 kubenswrapper[29252]: I1203 20:09:31.539014 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-host\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.539503 master-0 kubenswrapper[29252]: I1203 20:09:31.539045 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af2023e1-9c7a-40af-a6bf-fba31c3565b1-trusted-ca-bundle\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" Dec 03 20:09:31.539503 master-0 kubenswrapper[29252]: I1203 20:09:31.539125 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-proxy-tls\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 20:09:31.539503 master-0 kubenswrapper[29252]: I1203 20:09:31.539171 29252 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pf5q\" (UniqueName: \"kubernetes.io/projected/73b7027e-44f5-4c7b-9226-585a90530535-kube-api-access-7pf5q\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 20:09:31.539503 master-0 kubenswrapper[29252]: I1203 20:09:31.539270 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-etc-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.539503 master-0 kubenswrapper[29252]: I1203 20:09:31.539291 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-cni-netd\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.539503 master-0 kubenswrapper[29252]: I1203 20:09:31.539316 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdhcd\" (UniqueName: \"kubernetes.io/projected/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-kube-api-access-qdhcd\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 20:09:31.539503 master-0 kubenswrapper[29252]: I1203 20:09:31.539335 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f96c70ce-314a-4919-91e9-cc776a620846-etcd-client\") pod 
\"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 20:09:31.539503 master-0 kubenswrapper[29252]: I1203 20:09:31.539430 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c3afc439-ccaa-4751-95a1-ac7557e326f0-webhook-certs\") pod \"multus-admission-controller-5bdcc987c4-s6wpc\" (UID: \"c3afc439-ccaa-4751-95a1-ac7557e326f0\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc" Dec 03 20:09:31.539503 master-0 kubenswrapper[29252]: I1203 20:09:31.539474 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-tuned\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.539503 master-0 kubenswrapper[29252]: I1203 20:09:31.539502 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1f82c7a1-ec21-497d-86f2-562cafa7ace7-etc-docker\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 20:09:31.539952 master-0 kubenswrapper[29252]: I1203 20:09:31.539534 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-log-socket\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.539952 master-0 kubenswrapper[29252]: I1203 20:09:31.539579 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" 
(UniqueName: \"kubernetes.io/empty-dir/af2023e1-9c7a-40af-a6bf-fba31c3565b1-snapshots\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" Dec 03 20:09:31.539952 master-0 kubenswrapper[29252]: I1203 20:09:31.539597 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-tuned\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.539952 master-0 kubenswrapper[29252]: I1203 20:09:31.539609 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-proxy-ca-bundles\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:09:31.539952 master-0 kubenswrapper[29252]: I1203 20:09:31.539646 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5dpx\" (UniqueName: \"kubernetes.io/projected/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-kube-api-access-c5dpx\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 20:09:31.539952 master-0 kubenswrapper[29252]: I1203 20:09:31.539657 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/af2023e1-9c7a-40af-a6bf-fba31c3565b1-snapshots\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" Dec 03 20:09:31.539952 master-0 kubenswrapper[29252]: I1203 20:09:31.539686 29252 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/6eb4700c-6af0-468b-afc8-1e09b902d6bf-host-etc-kube\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" Dec 03 20:09:31.539952 master-0 kubenswrapper[29252]: I1203 20:09:31.539725 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f749c7f2-1fd7-4078-a92d-0ae5523998ac-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-f84784664-wnl8p\" (UID: \"f749c7f2-1fd7-4078-a92d-0ae5523998ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" Dec 03 20:09:31.539952 master-0 kubenswrapper[29252]: I1203 20:09:31.539759 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/6eb4700c-6af0-468b-afc8-1e09b902d6bf-host-etc-kube\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" Dec 03 20:09:31.539952 master-0 kubenswrapper[29252]: I1203 20:09:31.539763 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7171597-cb9a-451c-80a4-64cfccf885f0-tmp\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.539952 master-0 kubenswrapper[29252]: I1203 20:09:31.539835 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-run-netns\") pod \"ovnkube-node-l9m2r\" (UID: 
\"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.539952 master-0 kubenswrapper[29252]: I1203 20:09:31.539871 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d210062f-c07e-419f-a551-c37571565686-ovnkube-config\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 20:09:31.539952 master-0 kubenswrapper[29252]: I1203 20:09:31.539888 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7171597-cb9a-451c-80a4-64cfccf885f0-tmp\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.539952 master-0 kubenswrapper[29252]: I1203 20:09:31.539909 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ad22d8ed-2476-441b-aa3b-a7845606b0ac-images\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" Dec 03 20:09:31.539952 master-0 kubenswrapper[29252]: I1203 20:09:31.539940 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-audit\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.540383 master-0 kubenswrapper[29252]: I1203 20:09:31.539972 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/90610a53-b590-491e-8014-f0704afdc6e1-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:09:31.540383 master-0 kubenswrapper[29252]: I1203 20:09:31.540003 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f96c70ce-314a-4919-91e9-cc776a620846-etcd-serving-ca\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 20:09:31.540383 master-0 kubenswrapper[29252]: I1203 20:09:31.540030 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bdn5\" (UniqueName: \"kubernetes.io/projected/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2-kube-api-access-7bdn5\") pod \"community-operators-98lh5\" (UID: \"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2\") " pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:09:31.540383 master-0 kubenswrapper[29252]: I1203 20:09:31.540082 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-serving-cert\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.540383 master-0 kubenswrapper[29252]: I1203 20:09:31.540120 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d210062f-c07e-419f-a551-c37571565686-ovnkube-config\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 
03 20:09:31.540383 master-0 kubenswrapper[29252]: I1203 20:09:31.540128 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-cnibin\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 20:09:31.540383 master-0 kubenswrapper[29252]: I1203 20:09:31.540316 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/73b7027e-44f5-4c7b-9226-585a90530535-ca-certs\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 20:09:31.540383 master-0 kubenswrapper[29252]: I1203 20:09:31.540343 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8709c6c-8729-4702-a3fb-35a072855096-service-ca\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 20:09:31.540383 master-0 kubenswrapper[29252]: I1203 20:09:31.540371 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-run\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.540621 master-0 kubenswrapper[29252]: I1203 20:09:31.540402 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls\") pod 
\"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" Dec 03 20:09:31.540621 master-0 kubenswrapper[29252]: I1203 20:09:31.540429 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-client-ca\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj" Dec 03 20:09:31.540621 master-0 kubenswrapper[29252]: I1203 20:09:31.540454 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b8709c6c-8729-4702-a3fb-35a072855096-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 20:09:31.540621 master-0 kubenswrapper[29252]: I1203 20:09:31.540485 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f5df5c-fd9b-430d-aecc-242594b4aff1-config\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:09:31.540621 master-0 kubenswrapper[29252]: I1203 20:09:31.540500 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 20:09:31.540621 master-0 kubenswrapper[29252]: I1203 20:09:31.540511 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f618ea7-3ad7-4dce-b450-a8202285f312-ovn-node-metrics-cert\") pod \"ovnkube-node-l9m2r\" 
(UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.540621 master-0 kubenswrapper[29252]: I1203 20:09:31.540541 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr8x9\" (UniqueName: \"kubernetes.io/projected/5decce88-c71e-411c-87b5-a37dd0f77e7b-kube-api-access-mr8x9\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 20:09:31.540621 master-0 kubenswrapper[29252]: I1203 20:09:31.540567 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvklf\" (UniqueName: \"kubernetes.io/projected/f749c7f2-1fd7-4078-a92d-0ae5523998ac-kube-api-access-lvklf\") pod \"cluster-storage-operator-f84784664-wnl8p\" (UID: \"f749c7f2-1fd7-4078-a92d-0ae5523998ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" Dec 03 20:09:31.540621 master-0 kubenswrapper[29252]: I1203 20:09:31.540593 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-auth-proxy-config\") pod \"cluster-autoscaler-operator-7f88444875-kqfs4\" (UID: \"b2021db5-b27a-4e06-beec-d9ba82aa1ffc\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4" Dec 03 20:09:31.540621 master-0 kubenswrapper[29252]: I1203 20:09:31.540603 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/128ed384-7ab6-41b6-bf45-c8fda917d52f-metrics-tls\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" Dec 03 20:09:31.540621 master-0 kubenswrapper[29252]: I1203 20:09:31.540621 29252 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ed25861-1328-45e7-922e-37588a0b019c-trusted-ca\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 20:09:31.541052 master-0 kubenswrapper[29252]: I1203 20:09:31.540813 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f618ea7-3ad7-4dce-b450-a8202285f312-ovn-node-metrics-cert\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.541052 master-0 kubenswrapper[29252]: I1203 20:09:31.540988 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ed25861-1328-45e7-922e-37588a0b019c-trusted-ca\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 20:09:31.541495 master-0 kubenswrapper[29252]: I1203 20:09:31.541455 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-client-ca\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:09:31.541543 master-0 kubenswrapper[29252]: I1203 20:09:31.541502 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-mcd-auth-proxy-config\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " 
pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 20:09:31.541581 master-0 kubenswrapper[29252]: I1203 20:09:31.541553 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x66sr\" (UniqueName: \"kubernetes.io/projected/63e3d36d-1676-4f90-ac9a-d85b861a4655-kube-api-access-x66sr\") pod \"service-ca-6b8bb995f7-bj4vz\" (UID: \"63e3d36d-1676-4f90-ac9a-d85b861a4655\") " pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" Dec 03 20:09:31.541611 master-0 kubenswrapper[29252]: I1203 20:09:31.541581 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa8efc0-4514-4a14-80f5-ab9eca53a127-config\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" Dec 03 20:09:31.541649 master-0 kubenswrapper[29252]: I1203 20:09:31.541608 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/943feb0d-7d31-446a-9100-dfc4ef013d12-kube-api-access\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" Dec 03 20:09:31.541649 master-0 kubenswrapper[29252]: I1203 20:09:31.541635 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-config\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 20:09:31.541714 master-0 kubenswrapper[29252]: I1203 20:09:31.541659 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-slash\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.541714 master-0 kubenswrapper[29252]: I1203 20:09:31.541687 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.541794 master-0 kubenswrapper[29252]: I1203 20:09:31.541720 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 20:09:31.541794 master-0 kubenswrapper[29252]: I1203 20:09:31.541745 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-cni-binary-copy\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.541870 master-0 kubenswrapper[29252]: I1203 20:09:31.541795 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95zsj\" (UniqueName: \"kubernetes.io/projected/1f82c7a1-ec21-497d-86f2-562cafa7ace7-kube-api-access-95zsj\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 20:09:31.541870 master-0 kubenswrapper[29252]: I1203 
20:09:31.541824 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad22d8ed-2476-441b-aa3b-a7845606b0ac-config\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" Dec 03 20:09:31.541870 master-0 kubenswrapper[29252]: I1203 20:09:31.541796 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa8efc0-4514-4a14-80f5-ab9eca53a127-config\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" Dec 03 20:09:31.541870 master-0 kubenswrapper[29252]: I1203 20:09:31.541855 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/371917da-b783-4acc-81af-1cfc903269f4-iptables-alerter-script\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb" Dec 03 20:09:31.542009 master-0 kubenswrapper[29252]: I1203 20:09:31.541873 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-config\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 20:09:31.542009 master-0 kubenswrapper[29252]: I1203 20:09:31.541886 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f96c70ce-314a-4919-91e9-cc776a620846-serving-cert\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " 
pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 20:09:31.542009 master-0 kubenswrapper[29252]: I1203 20:09:31.541928 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b8709c6c-8729-4702-a3fb-35a072855096-etc-ssl-certs\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 20:09:31.542009 master-0 kubenswrapper[29252]: I1203 20:09:31.541937 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-cni-binary-copy\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.542009 master-0 kubenswrapper[29252]: I1203 20:09:31.541954 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 20:09:31.542009 master-0 kubenswrapper[29252]: I1203 20:09:31.542000 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhg82\" (UniqueName: \"kubernetes.io/projected/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-kube-api-access-qhg82\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 20:09:31.542229 master-0 kubenswrapper[29252]: I1203 20:09:31.542023 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-service-ca\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 20:09:31.542229 master-0 kubenswrapper[29252]: I1203 20:09:31.542037 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/371917da-b783-4acc-81af-1cfc903269f4-iptables-alerter-script\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb" Dec 03 20:09:31.542305 master-0 kubenswrapper[29252]: I1203 20:09:31.542042 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1f82c7a1-ec21-497d-86f2-562cafa7ace7-etc-containers\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 20:09:31.542349 master-0 kubenswrapper[29252]: I1203 20:09:31.542317 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgmkc\" (UniqueName: \"kubernetes.io/projected/a710102c-72fb-4d8d-ad99-71940368a09e-kube-api-access-zgmkc\") pod \"redhat-operators-9smb5\" (UID: \"a710102c-72fb-4d8d-ad99-71940368a09e\") " pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:09:31.542349 master-0 kubenswrapper[29252]: I1203 20:09:31.542337 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-var-lib-kubelet\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.542428 master-0 kubenswrapper[29252]: 
I1203 20:09:31.542184 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b5d4d0-b841-4e87-84b4-85911ff04325-metrics-certs\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 20:09:31.542428 master-0 kubenswrapper[29252]: I1203 20:09:31.542250 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/78a864f2-934f-4197-9753-24c9bc7f1fca-etcd-service-ca\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 20:09:31.542428 master-0 kubenswrapper[29252]: I1203 20:09:31.542357 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 20:09:31.542428 master-0 kubenswrapper[29252]: I1203 20:09:31.542077 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87f1759a-7df4-442e-a22d-6de8d54be333-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 20:09:31.542571 master-0 kubenswrapper[29252]: I1203 20:09:31.542430 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-etcd-serving-ca\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " 
pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.542571 master-0 kubenswrapper[29252]: I1203 20:09:31.542541 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c593a75e-c2af-4419-94da-e0c9ff14c41f-audit-dir\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.542649 master-0 kubenswrapper[29252]: I1203 20:09:31.542629 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 20:09:31.542709 master-0 kubenswrapper[29252]: I1203 20:09:31.542682 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkhcw\" (UniqueName: \"kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw\") pod \"network-check-target-x6vwd\" (UID: \"830d89af-1266-43ac-b113-990a28595f91\") " pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 20:09:31.542752 master-0 kubenswrapper[29252]: I1203 20:09:31.542736 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-kube-api-access\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" Dec 03 20:09:31.542813 master-0 kubenswrapper[29252]: I1203 20:09:31.542795 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6404bbc7-8ca9-4f20-8ce7-40f855555160-cco-trusted-ca\") pod \"cloud-credential-operator-7c4dc67499-lqdlr\" (UID: \"6404bbc7-8ca9-4f20-8ce7-40f855555160\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" Dec 03 20:09:31.542862 master-0 kubenswrapper[29252]: I1203 20:09:31.542824 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af2023e1-9c7a-40af-a6bf-fba31c3565b1-service-ca-bundle\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" Dec 03 20:09:31.542862 master-0 kubenswrapper[29252]: I1203 20:09:31.542855 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-systemd\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.542935 master-0 kubenswrapper[29252]: I1203 20:09:31.542881 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 20:09:31.542935 master-0 kubenswrapper[29252]: I1203 20:09:31.542909 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/371917da-b783-4acc-81af-1cfc903269f4-host-slash\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " 
pod="openshift-network-operator/iptables-alerter-72rrb" Dec 03 20:09:31.542991 master-0 kubenswrapper[29252]: I1203 20:09:31.542936 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f96c70ce-314a-4919-91e9-cc776a620846-encryption-config\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 20:09:31.542991 master-0 kubenswrapper[29252]: I1203 20:09:31.542963 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/433c3273-c99e-4d68-befc-06f92d2fc8d5-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" Dec 03 20:09:31.543113 master-0 kubenswrapper[29252]: I1203 20:09:31.542990 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-auth-proxy-config\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx" Dec 03 20:09:31.543113 master-0 kubenswrapper[29252]: I1203 20:09:31.543020 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/63e3d36d-1676-4f90-ac9a-d85b861a4655-signing-key\") pod \"service-ca-6b8bb995f7-bj4vz\" (UID: \"63e3d36d-1676-4f90-ac9a-d85b861a4655\") " pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" Dec 03 20:09:31.543113 master-0 kubenswrapper[29252]: I1203 20:09:31.543043 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-cnibin\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.543113 master-0 kubenswrapper[29252]: I1203 20:09:31.543066 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-socket-dir-parent\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.543217 master-0 kubenswrapper[29252]: I1203 20:09:31.543108 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wcmd\" (UniqueName: \"kubernetes.io/projected/90610a53-b590-491e-8014-f0704afdc6e1-kube-api-access-4wcmd\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:09:31.543217 master-0 kubenswrapper[29252]: I1203 20:09:31.543148 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/09f5df5c-fd9b-430d-aecc-242594b4aff1-machine-approver-tls\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:09:31.543217 master-0 kubenswrapper[29252]: I1203 20:09:31.543180 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkhn4\" (UniqueName: \"kubernetes.io/projected/f96c70ce-314a-4919-91e9-cc776a620846-kube-api-access-lkhn4\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 
03 20:09:31.543217 master-0 kubenswrapper[29252]: I1203 20:09:31.543206 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: \"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" Dec 03 20:09:31.543328 master-0 kubenswrapper[29252]: I1203 20:09:31.543208 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5decce88-c71e-411c-87b5-a37dd0f77e7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 20:09:31.543328 master-0 kubenswrapper[29252]: I1203 20:09:31.543240 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzd2g\" (UniqueName: \"kubernetes.io/projected/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-kube-api-access-qzd2g\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx" Dec 03 20:09:31.543328 master-0 kubenswrapper[29252]: I1203 20:09:31.543255 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/63e3d36d-1676-4f90-ac9a-d85b861a4655-signing-key\") pod \"service-ca-6b8bb995f7-bj4vz\" (UID: \"63e3d36d-1676-4f90-ac9a-d85b861a4655\") " pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" Dec 03 20:09:31.543328 master-0 kubenswrapper[29252]: I1203 20:09:31.543293 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-cert\") pod \"cluster-autoscaler-operator-7f88444875-kqfs4\" (UID: \"b2021db5-b27a-4e06-beec-d9ba82aa1ffc\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4" Dec 03 20:09:31.543328 master-0 kubenswrapper[29252]: I1203 20:09:31.543316 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b673cb04-f6f0-4113-bdcd-d6685b942c9f-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 20:09:31.543454 master-0 kubenswrapper[29252]: I1203 20:09:31.543332 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a864f2-934f-4197-9753-24c9bc7f1fca-serving-cert\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 20:09:31.543454 master-0 kubenswrapper[29252]: I1203 20:09:31.543361 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bb19329-c50c-4214-94c8-7e8771b99233-utilities\") pod \"certified-operators-mg96g\" (UID: \"6bb19329-c50c-4214-94c8-7e8771b99233\") " pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:09:31.543454 master-0 kubenswrapper[29252]: I1203 20:09:31.543386 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-daemon-config\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.543454 master-0 kubenswrapper[29252]: I1203 20:09:31.543412 29252 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59d2r\" (UniqueName: \"kubernetes.io/projected/78a864f2-934f-4197-9753-24c9bc7f1fca-kube-api-access-59d2r\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 20:09:31.543454 master-0 kubenswrapper[29252]: I1203 20:09:31.543440 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-kubelet\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.543590 master-0 kubenswrapper[29252]: I1203 20:09:31.543466 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8709c6c-8729-4702-a3fb-35a072855096-serving-cert\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 20:09:31.543590 master-0 kubenswrapper[29252]: I1203 20:09:31.543474 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bb19329-c50c-4214-94c8-7e8771b99233-utilities\") pod \"certified-operators-mg96g\" (UID: \"6bb19329-c50c-4214-94c8-7e8771b99233\") " pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:09:31.543590 master-0 kubenswrapper[29252]: I1203 20:09:31.543578 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a864f2-934f-4197-9753-24c9bc7f1fca-serving-cert\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " 
pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 20:09:31.543680 master-0 kubenswrapper[29252]: I1203 20:09:31.543655 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-daemon-config\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.546526 master-0 kubenswrapper[29252]: I1203 20:09:31.546498 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-ovnkube-identity-cm\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 20:09:31.561534 master-0 kubenswrapper[29252]: I1203 20:09:31.561354 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 20:09:31.581434 master-0 kubenswrapper[29252]: I1203 20:09:31.581386 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 20:09:31.588289 master-0 kubenswrapper[29252]: I1203 20:09:31.588214 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-etcd-client\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.600531 master-0 kubenswrapper[29252]: I1203 20:09:31.600472 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 20:09:31.606951 master-0 kubenswrapper[29252]: I1203 20:09:31.606871 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" 
(UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-image-import-ca\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.622130 master-0 kubenswrapper[29252]: I1203 20:09:31.622059 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 20:09:31.623131 master-0 kubenswrapper[29252]: I1203 20:09:31.623089 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-etcd-serving-ca\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.645394 master-0 kubenswrapper[29252]: I1203 20:09:31.645222 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-cnibin\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 20:09:31.645394 master-0 kubenswrapper[29252]: I1203 20:09:31.645303 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b8709c6c-8729-4702-a3fb-35a072855096-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 20:09:31.645394 master-0 kubenswrapper[29252]: I1203 20:09:31.645331 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-run\") pod \"tuned-l789w\" (UID: 
\"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.645761 master-0 kubenswrapper[29252]: I1203 20:09:31.645419 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.645761 master-0 kubenswrapper[29252]: I1203 20:09:31.645452 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-slash\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.645761 master-0 kubenswrapper[29252]: I1203 20:09:31.645502 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1f82c7a1-ec21-497d-86f2-562cafa7ace7-etc-containers\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 20:09:31.645761 master-0 kubenswrapper[29252]: I1203 20:09:31.645540 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b8709c6c-8729-4702-a3fb-35a072855096-etc-ssl-certs\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 20:09:31.645761 master-0 kubenswrapper[29252]: I1203 20:09:31.645580 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-var-lib-kubelet\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.645761 master-0 kubenswrapper[29252]: I1203 20:09:31.645607 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 20:09:31.645761 master-0 kubenswrapper[29252]: I1203 20:09:31.645632 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c593a75e-c2af-4419-94da-e0c9ff14c41f-audit-dir\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.645761 master-0 kubenswrapper[29252]: I1203 20:09:31.645664 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-systemd\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.645761 master-0 kubenswrapper[29252]: I1203 20:09:31.645734 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/371917da-b783-4acc-81af-1cfc903269f4-host-slash\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb" Dec 03 20:09:31.645761 master-0 kubenswrapper[29252]: I1203 20:09:31.645797 29252 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-cnibin\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.645761 master-0 kubenswrapper[29252]: I1203 20:09:31.645814 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-socket-dir-parent\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.645870 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-kubelet\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.645895 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-sysctl-d\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.645932 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-node-log\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.645968 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-lib-modules\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.645990 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/73b7027e-44f5-4c7b-9226-585a90530535-etc-containers\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646022 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-conf-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646048 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-etc-kubernetes\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646064 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/56e013ee-ea7a-4780-8986-a7fd1b5a3a3f-hosts-file\") pod \"node-resolver-hk22l\" (UID: \"56e013ee-ea7a-4780-8986-a7fd1b5a3a3f\") " pod="openshift-dns/node-resolver-hk22l" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646083 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" 
(UniqueName: \"kubernetes.io/host-path/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-rootfs\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646111 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-system-cni-dir\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646126 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-k8s-cni-cncf-io\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646147 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-run-ovn-kubernetes\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646178 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-hostroot\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646195 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-cni-bin\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646253 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-netns\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646285 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646306 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c593a75e-c2af-4419-94da-e0c9ff14c41f-node-pullsecrets\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646340 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-cni-multus\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646356 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-sysctl-conf\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646392 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-cni-bin\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646439 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-sys\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646460 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-system-cni-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646475 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-ovn\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646496 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-systemd\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646517 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-var-lib-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646538 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-cni-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646571 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/73b7027e-44f5-4c7b-9226-585a90530535-etc-docker\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646609 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-modprobe-d\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646665 29252 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646694 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-os-release\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646718 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-kubernetes\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646840 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646871 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-os-release\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646896 29252 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646935 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-sysconfig\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646957 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/90610a53-b590-491e-8014-f0704afdc6e1-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.646980 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-systemd-units\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647019 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-kubelet\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.647860 
master-0 kubenswrapper[29252]: I1203 20:09:31.647057 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-multus-certs\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647084 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f96c70ce-314a-4919-91e9-cc776a620846-audit-dir\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647107 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-host\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647152 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-etc-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647170 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1f82c7a1-ec21-497d-86f2-562cafa7ace7-etc-docker\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 
20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647191 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-log-socket\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647213 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-cni-netd\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647283 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-run-netns\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647387 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-run-netns\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647432 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-cnibin\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 20:09:31.647860 master-0 
kubenswrapper[29252]: I1203 20:09:31.647464 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b8709c6c-8729-4702-a3fb-35a072855096-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647513 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-run\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647545 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647574 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-slash\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647613 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1f82c7a1-ec21-497d-86f2-562cafa7ace7-etc-containers\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " 
pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647645 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b8709c6c-8729-4702-a3fb-35a072855096-etc-ssl-certs\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647687 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-var-lib-kubelet\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647743 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647792 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c593a75e-c2af-4419-94da-e0c9ff14c41f-audit-dir\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647825 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-systemd\") pod \"ovnkube-node-l9m2r\" (UID: 
\"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647856 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/371917da-b783-4acc-81af-1cfc903269f4-host-slash\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647900 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-cnibin\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647941 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-socket-dir-parent\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.647860 master-0 kubenswrapper[29252]: I1203 20:09:31.647971 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-kubelet\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648007 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-sysctl-d\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " 
pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648038 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-node-log\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648092 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-lib-modules\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648141 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/73b7027e-44f5-4c7b-9226-585a90530535-etc-containers\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648172 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-conf-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648203 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-etc-kubernetes\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " 
pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648243 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/56e013ee-ea7a-4780-8986-a7fd1b5a3a3f-hosts-file\") pod \"node-resolver-hk22l\" (UID: \"56e013ee-ea7a-4780-8986-a7fd1b5a3a3f\") " pod="openshift-dns/node-resolver-hk22l" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648273 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-rootfs\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648303 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-system-cni-dir\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648335 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-k8s-cni-cncf-io\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648364 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-run-ovn-kubernetes\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648395 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-hostroot\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648424 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-cni-bin\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648455 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-netns\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648484 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648524 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c593a75e-c2af-4419-94da-e0c9ff14c41f-node-pullsecrets\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.652186 
master-0 kubenswrapper[29252]: I1203 20:09:31.648557 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-cni-multus\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648593 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-sysctl-conf\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648623 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-cni-bin\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648664 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-sys\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648710 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-system-cni-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648742 29252 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-ovn\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648796 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-systemd\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648830 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-var-lib-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648877 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-multus-cni-dir\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648915 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/73b7027e-44f5-4c7b-9226-585a90530535-etc-docker\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.648956 29252 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-modprobe-d\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.649124 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-os-release\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.649185 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-kubernetes\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.649218 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-run-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.649264 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87f1759a-7df4-442e-a22d-6de8d54be333-os-release\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.649295 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.649332 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-etc-sysconfig\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.649370 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/90610a53-b590-491e-8014-f0704afdc6e1-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.649404 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-systemd-units\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.649433 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-var-lib-kubelet\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.649461 29252 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-host-run-multus-certs\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.649496 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f96c70ce-314a-4919-91e9-cc776a620846-audit-dir\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.649530 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7171597-cb9a-451c-80a4-64cfccf885f0-host\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.649559 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-etc-openvswitch\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.649595 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1f82c7a1-ec21-497d-86f2-562cafa7ace7-etc-docker\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.649625 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-log-socket\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.652186 master-0 kubenswrapper[29252]: I1203 20:09:31.649655 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f618ea7-3ad7-4dce-b450-a8202285f312-host-cni-netd\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:31.661640 master-0 kubenswrapper[29252]: I1203 20:09:31.661448 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 20:09:31.665083 master-0 kubenswrapper[29252]: I1203 20:09:31.664970 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:09:31.673899 master-0 kubenswrapper[29252]: I1203 20:09:31.670018 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 20:09:31.677475 master-0 kubenswrapper[29252]: I1203 20:09:31.677406 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-trusted-ca-bundle\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.680276 master-0 kubenswrapper[29252]: I1203 20:09:31.679887 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:09:31.684861 master-0 kubenswrapper[29252]: I1203 20:09:31.684816 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 20:09:31.691363 master-0 kubenswrapper[29252]: I1203 20:09:31.691304 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-serving-cert\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.701075 master-0 kubenswrapper[29252]: I1203 20:09:31.701025 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 20:09:31.706515 master-0 kubenswrapper[29252]: I1203 20:09:31.706461 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c593a75e-c2af-4419-94da-e0c9ff14c41f-encryption-config\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.721457 master-0 kubenswrapper[29252]: I1203 20:09:31.721319 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 20:09:31.724919 master-0 kubenswrapper[29252]: I1203 20:09:31.724877 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-audit\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.740349 master-0 kubenswrapper[29252]: I1203 20:09:31.740220 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 
20:09:31.748154 master-0 kubenswrapper[29252]: I1203 20:09:31.748101 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-kubelet-dir\") pod \"e73e6013-87fc-40e2-a573-39930828faa7\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " Dec 03 20:09:31.748154 master-0 kubenswrapper[29252]: I1203 20:09:31.748151 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-var-lock\") pod \"e73e6013-87fc-40e2-a573-39930828faa7\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " Dec 03 20:09:31.748337 master-0 kubenswrapper[29252]: I1203 20:09:31.748240 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e73e6013-87fc-40e2-a573-39930828faa7" (UID: "e73e6013-87fc-40e2-a573-39930828faa7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:09:31.748337 master-0 kubenswrapper[29252]: I1203 20:09:31.748286 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c593a75e-c2af-4419-94da-e0c9ff14c41f-config\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:31.748452 master-0 kubenswrapper[29252]: I1203 20:09:31.748351 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-var-lock" (OuterVolumeSpecName: "var-lock") pod "e73e6013-87fc-40e2-a573-39930828faa7" (UID: "e73e6013-87fc-40e2-a573-39930828faa7"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:09:31.749406 master-0 kubenswrapper[29252]: I1203 20:09:31.749370 29252 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:31.749406 master-0 kubenswrapper[29252]: I1203 20:09:31.749395 29252 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e73e6013-87fc-40e2-a573-39930828faa7-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 20:09:31.761553 master-0 kubenswrapper[29252]: I1203 20:09:31.761511 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 20:09:31.781092 master-0 kubenswrapper[29252]: I1203 20:09:31.781040 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 20:09:31.800792 master-0 kubenswrapper[29252]: I1203 20:09:31.800740 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 20:09:31.810381 master-0 kubenswrapper[29252]: I1203 20:09:31.810329 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d196dca7-f940-4aa0-b20a-214d22b62db6-metrics-tls\") pod \"dns-default-dbfhg\" (UID: \"d196dca7-f940-4aa0-b20a-214d22b62db6\") " pod="openshift-dns/dns-default-dbfhg" Dec 03 20:09:31.821205 master-0 kubenswrapper[29252]: I1203 20:09:31.821169 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Dec 03 20:09:31.829292 master-0 kubenswrapper[29252]: I1203 20:09:31.829246 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1f82c7a1-ec21-497d-86f2-562cafa7ace7-catalogserver-certs\") pod 
\"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 20:09:31.841287 master-0 kubenswrapper[29252]: I1203 20:09:31.841245 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Dec 03 20:09:31.869525 master-0 kubenswrapper[29252]: I1203 20:09:31.869385 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Dec 03 20:09:31.880928 master-0 kubenswrapper[29252]: I1203 20:09:31.880832 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 20:09:31.890278 master-0 kubenswrapper[29252]: I1203 20:09:31.890179 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d196dca7-f940-4aa0-b20a-214d22b62db6-config-volume\") pod \"dns-default-dbfhg\" (UID: \"d196dca7-f940-4aa0-b20a-214d22b62db6\") " pod="openshift-dns/dns-default-dbfhg" Dec 03 20:09:31.902719 master-0 kubenswrapper[29252]: I1203 20:09:31.902572 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Dec 03 20:09:31.906805 master-0 kubenswrapper[29252]: I1203 20:09:31.906717 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1f82c7a1-ec21-497d-86f2-562cafa7ace7-ca-certs\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 20:09:31.920420 master-0 kubenswrapper[29252]: I1203 20:09:31.920377 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Dec 03 20:09:31.950283 master-0 kubenswrapper[29252]: I1203 20:09:31.950185 
29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Dec 03 20:09:31.951831 master-0 kubenswrapper[29252]: I1203 20:09:31.951715 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/73b7027e-44f5-4c7b-9226-585a90530535-ca-certs\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 20:09:31.961333 master-0 kubenswrapper[29252]: I1203 20:09:31.961279 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Dec 03 20:09:31.981252 master-0 kubenswrapper[29252]: I1203 20:09:31.980822 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 20:09:31.983083 master-0 kubenswrapper[29252]: I1203 20:09:31.983039 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f96c70ce-314a-4919-91e9-cc776a620846-serving-cert\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 20:09:32.002468 master-0 kubenswrapper[29252]: I1203 20:09:32.002422 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 20:09:32.010883 master-0 kubenswrapper[29252]: I1203 20:09:32.010839 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f96c70ce-314a-4919-91e9-cc776a620846-etcd-client\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 20:09:32.021011 
master-0 kubenswrapper[29252]: I1203 20:09:32.020960 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 20:09:32.024057 master-0 kubenswrapper[29252]: I1203 20:09:32.024010 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f96c70ce-314a-4919-91e9-cc776a620846-encryption-config\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 20:09:32.040848 master-0 kubenswrapper[29252]: I1203 20:09:32.040797 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 20:09:32.060285 master-0 kubenswrapper[29252]: I1203 20:09:32.060249 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 20:09:32.065204 master-0 kubenswrapper[29252]: I1203 20:09:32.065155 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f96c70ce-314a-4919-91e9-cc776a620846-trusted-ca-bundle\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 20:09:32.080985 master-0 kubenswrapper[29252]: I1203 20:09:32.080949 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 20:09:32.101353 master-0 kubenswrapper[29252]: I1203 20:09:32.101293 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 20:09:32.108963 master-0 kubenswrapper[29252]: I1203 20:09:32.108905 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/f96c70ce-314a-4919-91e9-cc776a620846-audit-policies\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 20:09:32.121478 master-0 kubenswrapper[29252]: I1203 20:09:32.121427 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 20:09:32.141635 master-0 kubenswrapper[29252]: I1203 20:09:32.141592 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 20:09:32.144165 master-0 kubenswrapper[29252]: I1203 20:09:32.144132 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8709c6c-8729-4702-a3fb-35a072855096-serving-cert\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 20:09:32.161344 master-0 kubenswrapper[29252]: I1203 20:09:32.160901 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 20:09:32.171122 master-0 kubenswrapper[29252]: I1203 20:09:32.171080 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8709c6c-8729-4702-a3fb-35a072855096-service-ca\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 20:09:32.181735 master-0 kubenswrapper[29252]: I1203 20:09:32.181662 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 20:09:32.191170 master-0 kubenswrapper[29252]: I1203 20:09:32.191135 29252 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f96c70ce-314a-4919-91e9-cc776a620846-etcd-serving-ca\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 20:09:32.201148 master-0 kubenswrapper[29252]: I1203 20:09:32.201110 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 20:09:32.208989 master-0 kubenswrapper[29252]: I1203 20:09:32.208953 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd35fc5f-07ab-4c66-9b80-33a598d417ef-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-66f4cc99d4-2llfg\" (UID: \"cd35fc5f-07ab-4c66-9b80-33a598d417ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg" Dec 03 20:09:32.221607 master-0 kubenswrapper[29252]: I1203 20:09:32.221567 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 20:09:32.240254 master-0 kubenswrapper[29252]: I1203 20:09:32.240220 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 20:09:32.261172 master-0 kubenswrapper[29252]: I1203 20:09:32.261123 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Dec 03 20:09:32.269528 master-0 kubenswrapper[29252]: I1203 20:09:32.269496 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6404bbc7-8ca9-4f20-8ce7-40f855555160-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-7c4dc67499-lqdlr\" (UID: 
\"6404bbc7-8ca9-4f20-8ce7-40f855555160\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" Dec 03 20:09:32.286943 master-0 kubenswrapper[29252]: I1203 20:09:32.286828 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Dec 03 20:09:32.293575 master-0 kubenswrapper[29252]: I1203 20:09:32.293528 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/433c3273-c99e-4d68-befc-06f92d2fc8d5-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" Dec 03 20:09:32.300978 master-0 kubenswrapper[29252]: I1203 20:09:32.300953 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Dec 03 20:09:32.331719 master-0 kubenswrapper[29252]: I1203 20:09:32.331654 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Dec 03 20:09:32.333468 master-0 kubenswrapper[29252]: I1203 20:09:32.333428 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6404bbc7-8ca9-4f20-8ce7-40f855555160-cco-trusted-ca\") pod \"cloud-credential-operator-7c4dc67499-lqdlr\" (UID: \"6404bbc7-8ca9-4f20-8ce7-40f855555160\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" Dec 03 20:09:32.340429 master-0 kubenswrapper[29252]: I1203 20:09:32.340377 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Dec 03 20:09:32.360714 master-0 kubenswrapper[29252]: I1203 20:09:32.360667 29252 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Dec 03 20:09:32.368374 master-0 kubenswrapper[29252]: I1203 20:09:32.367984 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/433c3273-c99e-4d68-befc-06f92d2fc8d5-cert\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" Dec 03 20:09:32.380394 master-0 kubenswrapper[29252]: I1203 20:09:32.380334 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Dec 03 20:09:32.386533 master-0 kubenswrapper[29252]: I1203 20:09:32.386481 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/433c3273-c99e-4d68-befc-06f92d2fc8d5-config\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" Dec 03 20:09:32.401473 master-0 kubenswrapper[29252]: I1203 20:09:32.401389 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 20:09:32.419956 master-0 kubenswrapper[29252]: I1203 20:09:32.419760 29252 request.go:700] Waited for 1.00331441s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-samples-operator/secrets?fieldSelector=metadata.name%3Dsamples-operator-tls&limit=500&resourceVersion=0 Dec 03 20:09:32.421634 master-0 kubenswrapper[29252]: I1203 20:09:32.421584 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 20:09:32.425828 master-0 kubenswrapper[29252]: I1203 20:09:32.425757 29252 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a82ff78-4383-4ca8-8a72-98c2ee50ffe2-samples-operator-tls\") pod \"cluster-samples-operator-6d64b47964-h9nkv\" (UID: \"6a82ff78-4383-4ca8-8a72-98c2ee50ffe2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv" Dec 03 20:09:32.441244 master-0 kubenswrapper[29252]: I1203 20:09:32.441179 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 20:09:32.462910 master-0 kubenswrapper[29252]: I1203 20:09:32.462594 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Dec 03 20:09:32.464825 master-0 kubenswrapper[29252]: I1203 20:09:32.464767 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/433c3273-c99e-4d68-befc-06f92d2fc8d5-images\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" Dec 03 20:09:32.481732 master-0 kubenswrapper[29252]: I1203 20:09:32.481687 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Dec 03 20:09:32.490926 master-0 kubenswrapper[29252]: I1203 20:09:32.490878 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f749c7f2-1fd7-4078-a92d-0ae5523998ac-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-f84784664-wnl8p\" (UID: \"f749c7f2-1fd7-4078-a92d-0ae5523998ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" Dec 03 20:09:32.500754 master-0 kubenswrapper[29252]: I1203 20:09:32.500714 29252 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 20:09:32.502899 master-0 kubenswrapper[29252]: I1203 20:09:32.502863 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-mcd-auth-proxy-config\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 20:09:32.504054 master-0 kubenswrapper[29252]: I1203 20:09:32.504011 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-auth-proxy-config\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx" Dec 03 20:09:32.521366 master-0 kubenswrapper[29252]: I1203 20:09:32.521298 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 20:09:32.535411 master-0 kubenswrapper[29252]: E1203 20:09:32.535360 29252 secret.go:189] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 03 20:09:32.535727 master-0 kubenswrapper[29252]: E1203 20:09:32.535415 29252 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 20:09:32.535727 master-0 kubenswrapper[29252]: E1203 20:09:32.535443 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad22d8ed-2476-441b-aa3b-a7845606b0ac-machine-api-operator-tls podName:ad22d8ed-2476-441b-aa3b-a7845606b0ac nodeName:}" failed. 
No retries permitted until 2025-12-03 20:09:33.035423939 +0000 UTC m=+7.848968892 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/ad22d8ed-2476-441b-aa3b-a7845606b0ac-machine-api-operator-tls") pod "machine-api-operator-7486ff55f-9p9rq" (UID: "ad22d8ed-2476-441b-aa3b-a7845606b0ac") : failed to sync secret cache: timed out waiting for the condition Dec 03 20:09:32.535727 master-0 kubenswrapper[29252]: E1203 20:09:32.535526 29252 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: failed to sync configmap cache: timed out waiting for the condition Dec 03 20:09:32.535727 master-0 kubenswrapper[29252]: E1203 20:09:32.535563 29252 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 20:09:32.535727 master-0 kubenswrapper[29252]: E1203 20:09:32.535540 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c52974d8-fbe6-444b-97ae-468482eebac8-serving-cert podName:c52974d8-fbe6-444b-97ae-468482eebac8 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.035520091 +0000 UTC m=+7.849065054 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c52974d8-fbe6-444b-97ae-468482eebac8-serving-cert") pod "route-controller-manager-86dd7cbd76-jg7rj" (UID: "c52974d8-fbe6-444b-97ae-468482eebac8") : failed to sync secret cache: timed out waiting for the condition Dec 03 20:09:32.535727 master-0 kubenswrapper[29252]: E1203 20:09:32.535611 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-apiservice-cert podName:d7f613c6-77d6-4cf9-afa0-7c494dee2a8e nodeName:}" failed. 
No retries permitted until 2025-12-03 20:09:33.035597933 +0000 UTC m=+7.849142886 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-apiservice-cert") pod "packageserver-9b474c48f-lx8ch" (UID: "d7f613c6-77d6-4cf9-afa0-7c494dee2a8e") : failed to sync secret cache: timed out waiting for the condition Dec 03 20:09:32.535727 master-0 kubenswrapper[29252]: E1203 20:09:32.535627 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/90610a53-b590-491e-8014-f0704afdc6e1-images podName:90610a53-b590-491e-8014-f0704afdc6e1 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.035619774 +0000 UTC m=+7.849164827 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/90610a53-b590-491e-8014-f0704afdc6e1-images") pod "cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" (UID: "90610a53-b590-491e-8014-f0704afdc6e1") : failed to sync configmap cache: timed out waiting for the condition Dec 03 20:09:32.536624 master-0 kubenswrapper[29252]: E1203 20:09:32.536583 29252 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Dec 03 20:09:32.536624 master-0 kubenswrapper[29252]: E1203 20:09:32.536600 29252 secret.go:189] Couldn't get secret openshift-insights/openshift-insights-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 20:09:32.536624 master-0 kubenswrapper[29252]: E1203 20:09:32.536586 29252 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Dec 03 20:09:32.536852 master-0 kubenswrapper[29252]: E1203 20:09:32.536629 29252 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-config podName:1c22cb59-5083-4be6-9998-a9e67a2c20cd nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.036620308 +0000 UTC m=+7.850165361 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-config") pod "controller-manager-ff788744d-hkt6c" (UID: "1c22cb59-5083-4be6-9998-a9e67a2c20cd") : failed to sync configmap cache: timed out waiting for the condition Dec 03 20:09:32.536852 master-0 kubenswrapper[29252]: E1203 20:09:32.536659 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af2023e1-9c7a-40af-a6bf-fba31c3565b1-serving-cert podName:af2023e1-9c7a-40af-a6bf-fba31c3565b1 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.036651749 +0000 UTC m=+7.850196692 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/af2023e1-9c7a-40af-a6bf-fba31c3565b1-serving-cert") pod "insights-operator-59d99f9b7b-h64kt" (UID: "af2023e1-9c7a-40af-a6bf-fba31c3565b1") : failed to sync secret cache: timed out waiting for the condition Dec 03 20:09:32.536852 master-0 kubenswrapper[29252]: E1203 20:09:32.536672 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-proxy-tls podName:8dbbb6f8-711c-49a0-bc36-fa5d50124bd8 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.036665089 +0000 UTC m=+7.850210042 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-proxy-tls") pod "machine-config-operator-664c9d94c9-lt6dx" (UID: "8dbbb6f8-711c-49a0-bc36-fa5d50124bd8") : failed to sync secret cache: timed out waiting for the condition
Dec 03 20:09:32.537761 master-0 kubenswrapper[29252]: E1203 20:09:32.537719 29252 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.537870 master-0 kubenswrapper[29252]: E1203 20:09:32.537803 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-config podName:c52974d8-fbe6-444b-97ae-468482eebac8 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.037767356 +0000 UTC m=+7.851312399 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-config") pod "route-controller-manager-86dd7cbd76-jg7rj" (UID: "c52974d8-fbe6-444b-97ae-468482eebac8") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.538934 master-0 kubenswrapper[29252]: E1203 20:09:32.538895 29252 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition
Dec 03 20:09:32.538934 master-0 kubenswrapper[29252]: E1203 20:09:32.538911 29252 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.538934 master-0 kubenswrapper[29252]: E1203 20:09:32.538938 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c22cb59-5083-4be6-9998-a9e67a2c20cd-serving-cert podName:1c22cb59-5083-4be6-9998-a9e67a2c20cd nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.038928774 +0000 UTC m=+7.852473737 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/1c22cb59-5083-4be6-9998-a9e67a2c20cd-serving-cert") pod "controller-manager-ff788744d-hkt6c" (UID: "1c22cb59-5083-4be6-9998-a9e67a2c20cd") : failed to sync secret cache: timed out waiting for the condition
Dec 03 20:09:32.539177 master-0 kubenswrapper[29252]: E1203 20:09:32.538948 29252 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Dec 03 20:09:32.539177 master-0 kubenswrapper[29252]: E1203 20:09:32.538962 29252 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.539177 master-0 kubenswrapper[29252]: E1203 20:09:32.538982 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/90610a53-b590-491e-8014-f0704afdc6e1-auth-proxy-config podName:90610a53-b590-491e-8014-f0704afdc6e1 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.038962714 +0000 UTC m=+7.852507697 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/90610a53-b590-491e-8014-f0704afdc6e1-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" (UID: "90610a53-b590-491e-8014-f0704afdc6e1") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.539177 master-0 kubenswrapper[29252]: E1203 20:09:32.539006 29252 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.539177 master-0 kubenswrapper[29252]: E1203 20:09:32.539016 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-webhook-cert podName:d7f613c6-77d6-4cf9-afa0-7c494dee2a8e nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.039009376 +0000 UTC m=+7.852554329 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-webhook-cert") pod "packageserver-9b474c48f-lx8ch" (UID: "d7f613c6-77d6-4cf9-afa0-7c494dee2a8e") : failed to sync secret cache: timed out waiting for the condition
Dec 03 20:09:32.539177 master-0 kubenswrapper[29252]: E1203 20:09:32.539048 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/09f5df5c-fd9b-430d-aecc-242594b4aff1-auth-proxy-config podName:09f5df5c-fd9b-430d-aecc-242594b4aff1 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.039026766 +0000 UTC m=+7.852571719 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/09f5df5c-fd9b-430d-aecc-242594b4aff1-auth-proxy-config") pod "machine-approver-cb84b9cdf-7wrpf" (UID: "09f5df5c-fd9b-430d-aecc-242594b4aff1") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.539177 master-0 kubenswrapper[29252]: E1203 20:09:32.539065 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-images podName:8dbbb6f8-711c-49a0-bc36-fa5d50124bd8 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.039059407 +0000 UTC m=+7.852604360 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-images") pod "machine-config-operator-664c9d94c9-lt6dx" (UID: "8dbbb6f8-711c-49a0-bc36-fa5d50124bd8") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.540175 master-0 kubenswrapper[29252]: E1203 20:09:32.540132 29252 secret.go:189] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition
Dec 03 20:09:32.540175 master-0 kubenswrapper[29252]: E1203 20:09:32.540170 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-proxy-tls podName:9891cf64-59e8-4d8d-94fe-17cfa4b18c1b nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.040162663 +0000 UTC m=+7.853707716 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-proxy-tls") pod "machine-config-daemon-7t8bs" (UID: "9891cf64-59e8-4d8d-94fe-17cfa4b18c1b") : failed to sync secret cache: timed out waiting for the condition
Dec 03 20:09:32.540332 master-0 kubenswrapper[29252]: E1203 20:09:32.540186 29252 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition
Dec 03 20:09:32.540332 master-0 kubenswrapper[29252]: E1203 20:09:32.540207 29252 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.540332 master-0 kubenswrapper[29252]: E1203 20:09:32.540244 29252 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: failed to sync secret cache: timed out waiting for the condition
Dec 03 20:09:32.540332 master-0 kubenswrapper[29252]: E1203 20:09:32.540215 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3afc439-ccaa-4751-95a1-ac7557e326f0-webhook-certs podName:c3afc439-ccaa-4751-95a1-ac7557e326f0 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.040206094 +0000 UTC m=+7.853751137 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c3afc439-ccaa-4751-95a1-ac7557e326f0-webhook-certs") pod "multus-admission-controller-5bdcc987c4-s6wpc" (UID: "c3afc439-ccaa-4751-95a1-ac7557e326f0") : failed to sync secret cache: timed out waiting for the condition
Dec 03 20:09:32.540332 master-0 kubenswrapper[29252]: E1203 20:09:32.540284 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-proxy-ca-bundles podName:1c22cb59-5083-4be6-9998-a9e67a2c20cd nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.040263706 +0000 UTC m=+7.853808689 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-proxy-ca-bundles") pod "controller-manager-ff788744d-hkt6c" (UID: "1c22cb59-5083-4be6-9998-a9e67a2c20cd") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.540332 master-0 kubenswrapper[29252]: E1203 20:09:32.540299 29252 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.540332 master-0 kubenswrapper[29252]: E1203 20:09:32.540310 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90610a53-b590-491e-8014-f0704afdc6e1-cloud-controller-manager-operator-tls podName:90610a53-b590-491e-8014-f0704afdc6e1 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.040298337 +0000 UTC m=+7.853843330 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/90610a53-b590-491e-8014-f0704afdc6e1-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" (UID: "90610a53-b590-491e-8014-f0704afdc6e1") : failed to sync secret cache: timed out waiting for the condition
Dec 03 20:09:32.540332 master-0 kubenswrapper[29252]: E1203 20:09:32.540290 29252 configmap.go:193] Couldn't get configMap openshift-insights/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.540332 master-0 kubenswrapper[29252]: E1203 20:09:32.540336 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad22d8ed-2476-441b-aa3b-a7845606b0ac-images podName:ad22d8ed-2476-441b-aa3b-a7845606b0ac nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.040325677 +0000 UTC m=+7.853870670 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/ad22d8ed-2476-441b-aa3b-a7845606b0ac-images") pod "machine-api-operator-7486ff55f-9p9rq" (UID: "ad22d8ed-2476-441b-aa3b-a7845606b0ac") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.540837 master-0 kubenswrapper[29252]: E1203 20:09:32.540366 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/af2023e1-9c7a-40af-a6bf-fba31c3565b1-trusted-ca-bundle podName:af2023e1-9c7a-40af-a6bf-fba31c3565b1 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.040356228 +0000 UTC m=+7.853901211 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/af2023e1-9c7a-40af-a6bf-fba31c3565b1-trusted-ca-bundle") pod "insights-operator-59d99f9b7b-h64kt" (UID: "af2023e1-9c7a-40af-a6bf-fba31c3565b1") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.541473 master-0 kubenswrapper[29252]: E1203 20:09:32.541432 29252 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy-cluster-autoscaler-operator: failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.541473 master-0 kubenswrapper[29252]: E1203 20:09:32.541468 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-auth-proxy-config podName:b2021db5-b27a-4e06-beec-d9ba82aa1ffc nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.041460365 +0000 UTC m=+7.855005318 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-auth-proxy-config") pod "cluster-autoscaler-operator-7f88444875-kqfs4" (UID: "b2021db5-b27a-4e06-beec-d9ba82aa1ffc") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.541608 master-0 kubenswrapper[29252]: E1203 20:09:32.541490 29252 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.541608 master-0 kubenswrapper[29252]: E1203 20:09:32.541497 29252 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.541608 master-0 kubenswrapper[29252]: E1203 20:09:32.541540 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/09f5df5c-fd9b-430d-aecc-242594b4aff1-config podName:09f5df5c-fd9b-430d-aecc-242594b4aff1 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.041525606 +0000 UTC m=+7.855070569 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/09f5df5c-fd9b-430d-aecc-242594b4aff1-config") pod "machine-approver-cb84b9cdf-7wrpf" (UID: "09f5df5c-fd9b-430d-aecc-242594b4aff1") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.541608 master-0 kubenswrapper[29252]: E1203 20:09:32.541559 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-client-ca podName:c52974d8-fbe6-444b-97ae-468482eebac8 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.041550247 +0000 UTC m=+7.855095210 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-client-ca") pod "route-controller-manager-86dd7cbd76-jg7rj" (UID: "c52974d8-fbe6-444b-97ae-468482eebac8") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.541608 master-0 kubenswrapper[29252]: E1203 20:09:32.541568 29252 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.542059 master-0 kubenswrapper[29252]: E1203 20:09:32.541648 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-client-ca podName:1c22cb59-5083-4be6-9998-a9e67a2c20cd nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.041627678 +0000 UTC m=+7.855172691 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-client-ca") pod "controller-manager-ff788744d-hkt6c" (UID: "1c22cb59-5083-4be6-9998-a9e67a2c20cd") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.542059 master-0 kubenswrapper[29252]: I1203 20:09:32.541883 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Dec 03 20:09:32.542059 master-0 kubenswrapper[29252]: E1203 20:09:32.542025 29252 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.542234 master-0 kubenswrapper[29252]: E1203 20:09:32.542074 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad22d8ed-2476-441b-aa3b-a7845606b0ac-config podName:ad22d8ed-2476-441b-aa3b-a7845606b0ac nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.042064409 +0000 UTC m=+7.855609372 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ad22d8ed-2476-441b-aa3b-a7845606b0ac-config") pod "machine-api-operator-7486ff55f-9p9rq" (UID: "ad22d8ed-2476-441b-aa3b-a7845606b0ac") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.543263 master-0 kubenswrapper[29252]: E1203 20:09:32.543224 29252 secret.go:189] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition
Dec 03 20:09:32.543263 master-0 kubenswrapper[29252]: E1203 20:09:32.543257 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f5df5c-fd9b-430d-aecc-242594b4aff1-machine-approver-tls podName:09f5df5c-fd9b-430d-aecc-242594b4aff1 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.043250027 +0000 UTC m=+7.856794980 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/09f5df5c-fd9b-430d-aecc-242594b4aff1-machine-approver-tls") pod "machine-approver-cb84b9cdf-7wrpf" (UID: "09f5df5c-fd9b-430d-aecc-242594b4aff1") : failed to sync secret cache: timed out waiting for the condition
Dec 03 20:09:32.543436 master-0 kubenswrapper[29252]: E1203 20:09:32.543306 29252 configmap.go:193] Couldn't get configMap openshift-insights/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.543436 master-0 kubenswrapper[29252]: E1203 20:09:32.543373 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/af2023e1-9c7a-40af-a6bf-fba31c3565b1-service-ca-bundle podName:af2023e1-9c7a-40af-a6bf-fba31c3565b1 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.04335781 +0000 UTC m=+7.856902793 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/af2023e1-9c7a-40af-a6bf-fba31c3565b1-service-ca-bundle") pod "insights-operator-59d99f9b7b-h64kt" (UID: "af2023e1-9c7a-40af-a6bf-fba31c3565b1") : failed to sync configmap cache: timed out waiting for the condition
Dec 03 20:09:32.544417 master-0 kubenswrapper[29252]: E1203 20:09:32.544372 29252 secret.go:189] Couldn't get secret openshift-machine-api/cluster-autoscaler-operator-cert: failed to sync secret cache: timed out waiting for the condition
Dec 03 20:09:32.544417 master-0 kubenswrapper[29252]: E1203 20:09:32.544414 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-cert podName:b2021db5-b27a-4e06-beec-d9ba82aa1ffc nodeName:}" failed. No retries permitted until 2025-12-03 20:09:33.044405825 +0000 UTC m=+7.857950778 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-cert") pod "cluster-autoscaler-operator-7f88444875-kqfs4" (UID: "b2021db5-b27a-4e06-beec-d9ba82aa1ffc") : failed to sync secret cache: timed out waiting for the condition
Dec 03 20:09:32.562215 master-0 kubenswrapper[29252]: I1203 20:09:32.562159 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 03 20:09:32.581619 master-0 kubenswrapper[29252]: I1203 20:09:32.581569 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Dec 03 20:09:32.601723 master-0 kubenswrapper[29252]: I1203 20:09:32.601646 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Dec 03 20:09:32.621047 master-0 kubenswrapper[29252]: I1203 20:09:32.620974 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Dec 03 20:09:32.641428 master-0 kubenswrapper[29252]: I1203 20:09:32.641372 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Dec 03 20:09:32.662562 master-0 kubenswrapper[29252]: I1203 20:09:32.662468 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt"
Dec 03 20:09:32.671807 master-0 kubenswrapper[29252]: I1203 20:09:32.671657 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_efa3433149c0833909dd6c97d45272ed/kube-apiserver-check-endpoints/1.log"
Dec 03 20:09:32.673192 master-0 kubenswrapper[29252]: I1203 20:09:32.673136 29252 generic.go:334] "Generic (PLEG): container finished" podID="efa3433149c0833909dd6c97d45272ed" containerID="ee2aaab9b8550f344df3e7445ae5d2dcb743224979d469841109025bb15970fd" exitCode=255
Dec 03 20:09:32.673275 master-0 kubenswrapper[29252]: I1203 20:09:32.673238 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Dec 03 20:09:32.685539 master-0 kubenswrapper[29252]: I1203 20:09:32.685478 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Dec 03 20:09:32.700870 master-0 kubenswrapper[29252]: I1203 20:09:32.700830 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Dec 03 20:09:32.721413 master-0 kubenswrapper[29252]: I1203 20:09:32.721278 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 03 20:09:32.746004 master-0 kubenswrapper[29252]: I1203 20:09:32.745951 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 03 20:09:32.761923 master-0 kubenswrapper[29252]: I1203 20:09:32.761867 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 03 20:09:32.781148 master-0 kubenswrapper[29252]: I1203 20:09:32.781105 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 03 20:09:32.800970 master-0 kubenswrapper[29252]: I1203 20:09:32.800915 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 03 20:09:32.821798 master-0 kubenswrapper[29252]: I1203 20:09:32.821719 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Dec 03 20:09:32.841224 master-0 kubenswrapper[29252]: I1203 20:09:32.841166 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 03 20:09:32.862042 master-0 kubenswrapper[29252]: I1203 20:09:32.861976 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 03 20:09:32.881323 master-0 kubenswrapper[29252]: I1203 20:09:32.881274 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 03 20:09:32.902122 master-0 kubenswrapper[29252]: I1203 20:09:32.902032 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 03 20:09:32.921520 master-0 kubenswrapper[29252]: I1203 20:09:32.921184 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 03 20:09:32.941308 master-0 kubenswrapper[29252]: I1203 20:09:32.940953 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 03 20:09:32.962213 master-0 kubenswrapper[29252]: I1203 20:09:32.962119 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 03 20:09:32.983084 master-0 kubenswrapper[29252]: I1203 20:09:32.982962 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 03 20:09:33.002713 master-0 kubenswrapper[29252]: I1203 20:09:33.002621 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 03 20:09:33.021550 master-0 kubenswrapper[29252]: I1203 20:09:33.021483 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 03 20:09:33.042645 master-0 kubenswrapper[29252]: I1203 20:09:33.042585 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Dec 03 20:09:33.062375 master-0 kubenswrapper[29252]: I1203 20:09:33.062059 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 03 20:09:33.076151 master-0 kubenswrapper[29252]: I1203 20:09:33.076093 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-apiservice-cert\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch"
Dec 03 20:09:33.076588 master-0 kubenswrapper[29252]: I1203 20:09:33.076545 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af2023e1-9c7a-40af-a6bf-fba31c3565b1-serving-cert\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 20:09:33.076851 master-0 kubenswrapper[29252]: I1203 20:09:33.076816 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-config\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c"
Dec 03 20:09:33.077255 master-0 kubenswrapper[29252]: I1203 20:09:33.077213 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-proxy-tls\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 20:09:33.077576 master-0 kubenswrapper[29252]: I1203 20:09:33.077543 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-config\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:09:33.077955 master-0 kubenswrapper[29252]: I1203 20:09:33.077913 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-images\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 20:09:33.078346 master-0 kubenswrapper[29252]: I1203 20:09:33.076686 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-apiservice-cert\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch"
Dec 03 20:09:33.078346 master-0 kubenswrapper[29252]: I1203 20:09:33.077748 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-proxy-tls\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 20:09:33.078346 master-0 kubenswrapper[29252]: I1203 20:09:33.077280 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af2023e1-9c7a-40af-a6bf-fba31c3565b1-serving-cert\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 20:09:33.078346 master-0 kubenswrapper[29252]: I1203 20:09:33.078225 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-config\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:09:33.078346 master-0 kubenswrapper[29252]: I1203 20:09:33.078332 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-images\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx"
Dec 03 20:09:33.078815 master-0 kubenswrapper[29252]: I1203 20:09:33.078420 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-config\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c"
Dec 03 20:09:33.079198 master-0 kubenswrapper[29252]: I1203 20:09:33.079125 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/09f5df5c-fd9b-430d-aecc-242594b4aff1-auth-proxy-config\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf"
Dec 03 20:09:33.079318 master-0 kubenswrapper[29252]: I1203 20:09:33.079222 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c22cb59-5083-4be6-9998-a9e67a2c20cd-serving-cert\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c"
Dec 03 20:09:33.079409 master-0 kubenswrapper[29252]: I1203 20:09:33.079317 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/90610a53-b590-491e-8014-f0704afdc6e1-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75"
Dec 03 20:09:33.079409 master-0 kubenswrapper[29252]: I1203 20:09:33.079377 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-webhook-cert\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch"
Dec 03 20:09:33.079635 master-0 kubenswrapper[29252]: I1203 20:09:33.079603 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af2023e1-9c7a-40af-a6bf-fba31c3565b1-trusted-ca-bundle\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 20:09:33.079741 master-0 kubenswrapper[29252]: I1203 20:09:33.079641 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-proxy-tls\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs"
Dec 03 20:09:33.079741 master-0 kubenswrapper[29252]: I1203 20:09:33.079690 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c3afc439-ccaa-4751-95a1-ac7557e326f0-webhook-certs\") pod \"multus-admission-controller-5bdcc987c4-s6wpc\" (UID: \"c3afc439-ccaa-4751-95a1-ac7557e326f0\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc"
Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.079741 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-proxy-ca-bundles\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c"
Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.079829 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ad22d8ed-2476-441b-aa3b-a7845606b0ac-images\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq"
Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.079868 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/90610a53-b590-491e-8014-f0704afdc6e1-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75"
Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.079925 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-client-ca\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.079986 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-auth-proxy-config\") pod \"cluster-autoscaler-operator-7f88444875-kqfs4\" (UID: \"b2021db5-b27a-4e06-beec-d9ba82aa1ffc\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4"
Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080022 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-client-ca\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c"
Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080054 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f5df5c-fd9b-430d-aecc-242594b4aff1-config\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf"
Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080121 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-proxy-tls\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs"
Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080130 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad22d8ed-2476-441b-aa3b-a7845606b0ac-config\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq"
Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080291 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af2023e1-9c7a-40af-a6bf-fba31c3565b1-service-ca-bundle\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt"
Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080260 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-webhook-cert\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch"
Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080319 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-auth-proxy-config\") pod \"cluster-autoscaler-operator-7f88444875-kqfs4\" (UID: \"b2021db5-b27a-4e06-beec-d9ba82aa1ffc\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4"
Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080347 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/09f5df5c-fd9b-430d-aecc-242594b4aff1-machine-approver-tls\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf"
Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080349 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ad22d8ed-2476-441b-aa3b-a7845606b0ac-images\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq"
Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080370 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-proxy-ca-bundles\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c"
Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080347 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad22d8ed-2476-441b-aa3b-a7845606b0ac-config\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq"
Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080411 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-cert\") pod \"cluster-autoscaler-operator-7f88444875-kqfs4\" (UID: \"b2021db5-b27a-4e06-beec-d9ba82aa1ffc\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4"
Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080372 29252
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af2023e1-9c7a-40af-a6bf-fba31c3565b1-trusted-ca-bundle\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080591 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-client-ca\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080611 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af2023e1-9c7a-40af-a6bf-fba31c3565b1-service-ca-bundle\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080669 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad22d8ed-2476-441b-aa3b-a7845606b0ac-machine-api-operator-tls\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080723 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c52974d8-fbe6-444b-97ae-468482eebac8-serving-cert\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " 
pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj" Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080755 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/90610a53-b590-491e-8014-f0704afdc6e1-images\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080843 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/09f5df5c-fd9b-430d-aecc-242594b4aff1-machine-approver-tls\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080943 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c22cb59-5083-4be6-9998-a9e67a2c20cd-serving-cert\") pod \"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.080968 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad22d8ed-2476-441b-aa3b-a7845606b0ac-machine-api-operator-tls\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.081047 29252 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.081054 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-client-ca\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj" Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.081187 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c52974d8-fbe6-444b-97ae-468482eebac8-serving-cert\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj" Dec 03 20:09:33.081445 master-0 kubenswrapper[29252]: I1203 20:09:33.081056 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-cert\") pod \"cluster-autoscaler-operator-7f88444875-kqfs4\" (UID: \"b2021db5-b27a-4e06-beec-d9ba82aa1ffc\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4" Dec 03 20:09:33.090767 master-0 kubenswrapper[29252]: I1203 20:09:33.090699 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/09f5df5c-fd9b-430d-aecc-242594b4aff1-auth-proxy-config\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:09:33.101693 master-0 kubenswrapper[29252]: I1203 20:09:33.101636 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" 
Dec 03 20:09:33.111943 master-0 kubenswrapper[29252]: I1203 20:09:33.111647 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f5df5c-fd9b-430d-aecc-242594b4aff1-config\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:09:33.121568 master-0 kubenswrapper[29252]: I1203 20:09:33.121510 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 20:09:33.141597 master-0 kubenswrapper[29252]: I1203 20:09:33.141548 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 20:09:33.160902 master-0 kubenswrapper[29252]: I1203 20:09:33.160856 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Dec 03 20:09:33.161377 master-0 kubenswrapper[29252]: I1203 20:09:33.161322 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/90610a53-b590-491e-8014-f0704afdc6e1-images\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:09:33.185552 master-0 kubenswrapper[29252]: I1203 20:09:33.181167 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Dec 03 20:09:33.190650 master-0 kubenswrapper[29252]: I1203 20:09:33.190592 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/90610a53-b590-491e-8014-f0704afdc6e1-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:09:33.201379 master-0 kubenswrapper[29252]: I1203 20:09:33.201098 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Dec 03 20:09:33.211168 master-0 kubenswrapper[29252]: I1203 20:09:33.211123 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/90610a53-b590-491e-8014-f0704afdc6e1-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:09:33.221880 master-0 kubenswrapper[29252]: I1203 20:09:33.221822 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Dec 03 20:09:33.241582 master-0 kubenswrapper[29252]: I1203 20:09:33.241510 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 20:09:33.262723 master-0 kubenswrapper[29252]: I1203 20:09:33.262640 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-7tjv7" Dec 03 20:09:33.282276 master-0 kubenswrapper[29252]: I1203 20:09:33.282194 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-krxhq" Dec 03 20:09:33.302089 master-0 kubenswrapper[29252]: I1203 20:09:33.301846 29252 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-l56l4" Dec 03 20:09:33.321076 master-0 kubenswrapper[29252]: I1203 20:09:33.321027 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ngglc" Dec 03 20:09:33.341909 master-0 kubenswrapper[29252]: I1203 20:09:33.341846 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 20:09:33.350255 master-0 kubenswrapper[29252]: I1203 20:09:33.350179 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c3afc439-ccaa-4751-95a1-ac7557e326f0-webhook-certs\") pod \"multus-admission-controller-5bdcc987c4-s6wpc\" (UID: \"c3afc439-ccaa-4751-95a1-ac7557e326f0\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc" Dec 03 20:09:33.361192 master-0 kubenswrapper[29252]: I1203 20:09:33.361046 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-ztlqb" Dec 03 20:09:33.392884 master-0 kubenswrapper[29252]: I1203 20:09:33.392825 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sqtm\" (UniqueName: \"kubernetes.io/projected/01d51d9a-9beb-4357-9dc2-aeac210cd0c4-kube-api-access-6sqtm\") pod \"service-ca-operator-56f5898f45-v6rp5\" (UID: \"01d51d9a-9beb-4357-9dc2-aeac210cd0c4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-v6rp5" Dec 03 20:09:33.411271 master-0 kubenswrapper[29252]: I1203 20:09:33.411205 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crfnp\" (UniqueName: \"kubernetes.io/projected/0d4e4f88-7106-4a46-8b63-053345922fb0-kube-api-access-crfnp\") pod \"package-server-manager-75b4d49d4c-pqz7q\" (UID: \"0d4e4f88-7106-4a46-8b63-053345922fb0\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 20:09:33.433707 master-0 kubenswrapper[29252]: I1203 20:09:33.433657 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfqnq\" (UniqueName: \"kubernetes.io/projected/11e2c94f-f9e9-415b-a550-3006a4632ba4-kube-api-access-pfqnq\") pod \"kube-storage-version-migrator-operator-67c4cff67d-p7xj5\" (UID: \"11e2c94f-f9e9-415b-a550-3006a4632ba4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-p7xj5" Dec 03 20:09:33.439942 master-0 kubenswrapper[29252]: I1203 20:09:33.439889 29252 request.go:700] Waited for 2.007330215s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication-operator/serviceaccounts/authentication-operator/token Dec 03 20:09:33.457586 master-0 kubenswrapper[29252]: I1203 20:09:33.457458 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxqph\" (UniqueName: \"kubernetes.io/projected/a185ee17-4b4b-4d20-a8ed-56a2a01f1807-kube-api-access-sxqph\") pod \"authentication-operator-7479ffdf48-mfwhz\" (UID: \"a185ee17-4b4b-4d20-a8ed-56a2a01f1807\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" Dec 03 20:09:33.473236 master-0 kubenswrapper[29252]: I1203 20:09:33.473185 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-855t4\" (UniqueName: \"kubernetes.io/projected/ba68608f-6b36-455e-b80b-d19237df9312-kube-api-access-855t4\") pod \"cluster-monitoring-operator-69cc794c58-dhgcv\" (UID: \"ba68608f-6b36-455e-b80b-d19237df9312\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-dhgcv" Dec 03 20:09:33.480898 master-0 kubenswrapper[29252]: I1203 20:09:33.480661 29252 scope.go:117] "RemoveContainer" 
containerID="0b22734703d42f07c436963e348c3be11ab4f5053e6afed5996abb0dab7d690d" Dec 03 20:09:33.499673 master-0 kubenswrapper[29252]: I1203 20:09:33.499597 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk4z4\" (UniqueName: \"kubernetes.io/projected/f9f99422-7991-40ef-92a1-de2e603e47b9-kube-api-access-pk4z4\") pod \"cluster-olm-operator-589f5cdc9d-4fzrl\" (UID: \"f9f99422-7991-40ef-92a1-de2e603e47b9\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-4fzrl" Dec 03 20:09:33.731350 master-0 kubenswrapper[29252]: I1203 20:09:33.730875 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljsr6\" (UniqueName: \"kubernetes.io/projected/c3afc439-ccaa-4751-95a1-ac7557e326f0-kube-api-access-ljsr6\") pod \"multus-admission-controller-5bdcc987c4-s6wpc\" (UID: \"c3afc439-ccaa-4751-95a1-ac7557e326f0\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-s6wpc" Dec 03 20:09:33.732042 master-0 kubenswrapper[29252]: I1203 20:09:33.731984 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7t26\" (UniqueName: \"kubernetes.io/projected/d7f613c6-77d6-4cf9-afa0-7c494dee2a8e-kube-api-access-k7t26\") pod \"packageserver-9b474c48f-lx8ch\" (UID: \"d7f613c6-77d6-4cf9-afa0-7c494dee2a8e\") " pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 03 20:09:33.735865 master-0 kubenswrapper[29252]: I1203 20:09:33.735771 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8709c6c-8729-4702-a3fb-35a072855096-kube-api-access\") pod \"cluster-version-operator-7c49fbfc6f-q5wsd\" (UID: \"b8709c6c-8729-4702-a3fb-35a072855096\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-q5wsd" Dec 03 20:09:33.737055 master-0 kubenswrapper[29252]: I1203 20:09:33.737018 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-6ghnf\" (UniqueName: \"kubernetes.io/projected/a19b8f9e-6299-43bf-9aa5-22071b855773-kube-api-access-6ghnf\") pod \"olm-operator-76bd5d69c7-wg7fw\" (UID: \"a19b8f9e-6299-43bf-9aa5-22071b855773\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw" Dec 03 20:09:33.737564 master-0 kubenswrapper[29252]: I1203 20:09:33.737487 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvlxr\" (UniqueName: \"kubernetes.io/projected/56e013ee-ea7a-4780-8986-a7fd1b5a3a3f-kube-api-access-vvlxr\") pod \"node-resolver-hk22l\" (UID: \"56e013ee-ea7a-4780-8986-a7fd1b5a3a3f\") " pod="openshift-dns/node-resolver-hk22l" Dec 03 20:09:33.738995 master-0 kubenswrapper[29252]: I1203 20:09:33.738063 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d468\" (UniqueName: \"kubernetes.io/projected/6404bbc7-8ca9-4f20-8ce7-40f855555160-kube-api-access-4d468\") pod \"cloud-credential-operator-7c4dc67499-lqdlr\" (UID: \"6404bbc7-8ca9-4f20-8ce7-40f855555160\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-lqdlr" Dec 03 20:09:33.738995 master-0 kubenswrapper[29252]: E1203 20:09:33.738767 29252 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="2.316s" Dec 03 20:09:33.738995 master-0 kubenswrapper[29252]: I1203 20:09:33.738866 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:33.740186 master-0 kubenswrapper[29252]: I1203 20:09:33.740133 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjn9m\" (UniqueName: \"kubernetes.io/projected/ad22d8ed-2476-441b-aa3b-a7845606b0ac-kube-api-access-xjn9m\") pod \"machine-api-operator-7486ff55f-9p9rq\" (UID: \"ad22d8ed-2476-441b-aa3b-a7845606b0ac\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-9p9rq" Dec 
03 20:09:33.741941 master-0 kubenswrapper[29252]: I1203 20:09:33.741869 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7nkb\" (UniqueName: \"kubernetes.io/projected/6eb4700c-6af0-468b-afc8-1e09b902d6bf-kube-api-access-w7nkb\") pod \"network-operator-6cbf58c977-w7d8t\" (UID: \"6eb4700c-6af0-468b-afc8-1e09b902d6bf\") " pod="openshift-network-operator/network-operator-6cbf58c977-w7d8t" Dec 03 20:09:33.746129 master-0 kubenswrapper[29252]: I1203 20:09:33.746041 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-457ln\" (UniqueName: \"kubernetes.io/projected/d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f-kube-api-access-457ln\") pod \"openshift-apiserver-operator-667484ff5-lsltt\" (UID: \"d28fbd98-2f67-42f5-9e06-b2e27a4b2f4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-lsltt" Dec 03 20:09:33.746885 master-0 kubenswrapper[29252]: I1203 20:09:33.746832 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs8fx\" (UniqueName: \"kubernetes.io/projected/d7171597-cb9a-451c-80a4-64cfccf885f0-kube-api-access-gs8fx\") pod \"tuned-l789w\" (UID: \"d7171597-cb9a-451c-80a4-64cfccf885f0\") " pod="openshift-cluster-node-tuning-operator/tuned-l789w" Dec 03 20:09:33.753007 master-0 kubenswrapper[29252]: I1203 20:09:33.752760 29252 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Dec 03 20:09:33.766200 master-0 kubenswrapper[29252]: I1203 20:09:33.766129 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bhk4\" (UniqueName: \"kubernetes.io/projected/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-kube-api-access-6bhk4\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 20:09:33.837227 master-0 kubenswrapper[29252]: I1203 
20:09:33.837155 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdd6z\" (UniqueName: \"kubernetes.io/projected/af2023e1-9c7a-40af-a6bf-fba31c3565b1-kube-api-access-hdd6z\") pod \"insights-operator-59d99f9b7b-h64kt\" (UID: \"af2023e1-9c7a-40af-a6bf-fba31c3565b1\") " pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" Dec 03 20:09:33.837555 master-0 kubenswrapper[29252]: I1203 20:09:33.837360 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7xk9\" (UniqueName: \"kubernetes.io/projected/d210062f-c07e-419f-a551-c37571565686-kube-api-access-v7xk9\") pod \"ovnkube-control-plane-f9f7f4946-9pdrg\" (UID: \"d210062f-c07e-419f-a551-c37571565686\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-9pdrg" Dec 03 20:09:33.840213 master-0 kubenswrapper[29252]: I1203 20:09:33.840168 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4v7k\" (UniqueName: \"kubernetes.io/projected/371917da-b783-4acc-81af-1cfc903269f4-kube-api-access-w4v7k\") pod \"iptables-alerter-72rrb\" (UID: \"371917da-b783-4acc-81af-1cfc903269f4\") " pod="openshift-network-operator/iptables-alerter-72rrb" Dec 03 20:09:33.849321 master-0 kubenswrapper[29252]: I1203 20:09:33.849277 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb5j7\" (UniqueName: \"kubernetes.io/projected/367c2c7c-1fc8-4608-aa94-b64c6c70cc61-kube-api-access-hb5j7\") pod \"csi-snapshot-controller-86897dd478-s29k7\" (UID: \"367c2c7c-1fc8-4608-aa94-b64c6c70cc61\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-s29k7" Dec 03 20:09:33.851961 master-0 kubenswrapper[29252]: I1203 20:09:33.851924 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cnmn\" (UniqueName: \"kubernetes.io/projected/1c22cb59-5083-4be6-9998-a9e67a2c20cd-kube-api-access-7cnmn\") pod 
\"controller-manager-ff788744d-hkt6c\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:09:33.883996 master-0 kubenswrapper[29252]: I1203 20:09:33.879076 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qrgh\" (UniqueName: \"kubernetes.io/projected/128ed384-7ab6-41b6-bf45-c8fda917d52f-kube-api-access-7qrgh\") pod \"dns-operator-6b7bcd6566-4wcq2\" (UID: \"128ed384-7ab6-41b6-bf45-c8fda917d52f\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-4wcq2" Dec 03 20:09:33.901298 master-0 kubenswrapper[29252]: I1203 20:09:33.901239 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kszjr\" (UniqueName: \"kubernetes.io/projected/6bb19329-c50c-4214-94c8-7e8771b99233-kube-api-access-kszjr\") pod \"certified-operators-mg96g\" (UID: \"6bb19329-c50c-4214-94c8-7e8771b99233\") " pod="openshift-marketplace/certified-operators-mg96g" Dec 03 20:09:33.912399 master-0 kubenswrapper[29252]: I1203 20:09:33.912356 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bztz2\" (UniqueName: \"kubernetes.io/projected/a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6-kube-api-access-bztz2\") pod \"multus-p9sdj\" (UID: \"a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6\") " pod="openshift-multus/multus-p9sdj" Dec 03 20:09:34.008423 master-0 kubenswrapper[29252]: I1203 20:09:34.008324 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sdw4\" (UniqueName: \"kubernetes.io/projected/d5f33153-bff1-403f-ae17-b7e90500365d-kube-api-access-5sdw4\") pod \"catalog-operator-7cf5cf757f-25z8n\" (UID: \"d5f33153-bff1-403f-ae17-b7e90500365d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 20:09:34.008423 master-0 kubenswrapper[29252]: I1203 20:09:34.008344 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wvllg\" (UniqueName: \"kubernetes.io/projected/87f1759a-7df4-442e-a22d-6de8d54be333-kube-api-access-wvllg\") pod \"multus-additional-cni-plugins-pwlw2\" (UID: \"87f1759a-7df4-442e-a22d-6de8d54be333\") " pod="openshift-multus/multus-additional-cni-plugins-pwlw2" Dec 03 20:09:34.008614 master-0 kubenswrapper[29252]: I1203 20:09:34.008536 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbsx8\" (UniqueName: \"kubernetes.io/projected/daa8efc0-4514-4a14-80f5-ab9eca53a127-kube-api-access-rbsx8\") pod \"openshift-controller-manager-operator-7c4697b5f5-8jzqh\" (UID: \"daa8efc0-4514-4a14-80f5-ab9eca53a127\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-8jzqh" Dec 03 20:09:34.014608 master-0 kubenswrapper[29252]: I1203 20:09:34.014572 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c9qq\" (UniqueName: \"kubernetes.io/projected/2f618ea7-3ad7-4dce-b450-a8202285f312-kube-api-access-4c9qq\") pod \"ovnkube-node-l9m2r\" (UID: \"2f618ea7-3ad7-4dce-b450-a8202285f312\") " pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r" Dec 03 20:09:34.024921 master-0 kubenswrapper[29252]: I1203 20:09:34.024858 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trv6b\" (UniqueName: \"kubernetes.io/projected/b638f207-31df-4298-8801-4da6031deefc-kube-api-access-trv6b\") pod \"redhat-marketplace-wcnrx\" (UID: \"b638f207-31df-4298-8801-4da6031deefc\") " pod="openshift-marketplace/redhat-marketplace-wcnrx" Dec 03 20:09:34.037647 master-0 kubenswrapper[29252]: I1203 20:09:34.037600 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf-bound-sa-token\") pod \"ingress-operator-85dbd94574-l7bzj\" (UID: \"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf\") " 
pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" Dec 03 20:09:34.065482 master-0 kubenswrapper[29252]: I1203 20:09:34.065423 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6skg\" (UniqueName: \"kubernetes.io/projected/b2021db5-b27a-4e06-beec-d9ba82aa1ffc-kube-api-access-j6skg\") pod \"cluster-autoscaler-operator-7f88444875-kqfs4\" (UID: \"b2021db5-b27a-4e06-beec-d9ba82aa1ffc\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kqfs4" Dec 03 20:09:34.076873 master-0 kubenswrapper[29252]: I1203 20:09:34.076835 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk5wb\" (UniqueName: \"kubernetes.io/projected/cd35fc5f-07ab-4c66-9b80-33a598d417ef-kube-api-access-qk5wb\") pod \"control-plane-machine-set-operator-66f4cc99d4-2llfg\" (UID: \"cd35fc5f-07ab-4c66-9b80-33a598d417ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-2llfg" Dec 03 20:09:34.089660 master-0 kubenswrapper[29252]: I1203 20:09:34.089599 29252 scope.go:117] "RemoveContainer" containerID="d77636ae6fa70a30480be55d0b3b081bbffecdd76b888e95fdd9a2954e04756e" Dec 03 20:09:34.095268 master-0 kubenswrapper[29252]: I1203 20:09:34.095227 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2xcx\" (UniqueName: \"kubernetes.io/projected/c593a75e-c2af-4419-94da-e0c9ff14c41f-kube-api-access-j2xcx\") pod \"apiserver-b46c54696-bgb45\" (UID: \"c593a75e-c2af-4419-94da-e0c9ff14c41f\") " pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:34.119867 master-0 kubenswrapper[29252]: I1203 20:09:34.119830 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7vxl\" (UniqueName: \"kubernetes.io/projected/c52974d8-fbe6-444b-97ae-468482eebac8-kube-api-access-p7vxl\") pod \"route-controller-manager-86dd7cbd76-jg7rj\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") " 
pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj" Dec 03 20:09:34.142518 master-0 kubenswrapper[29252]: I1203 20:09:34.142467 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl5h7\" (UniqueName: \"kubernetes.io/projected/6a82ff78-4383-4ca8-8a72-98c2ee50ffe2-kube-api-access-dl5h7\") pod \"cluster-samples-operator-6d64b47964-h9nkv\" (UID: \"6a82ff78-4383-4ca8-8a72-98c2ee50ffe2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-h9nkv" Dec 03 20:09:34.169419 master-0 kubenswrapper[29252]: I1203 20:09:34.169345 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twlw5\" (UniqueName: \"kubernetes.io/projected/09f5df5c-fd9b-430d-aecc-242594b4aff1-kube-api-access-twlw5\") pod \"machine-approver-cb84b9cdf-7wrpf\" (UID: \"09f5df5c-fd9b-430d-aecc-242594b4aff1\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-7wrpf" Dec 03 20:09:34.179070 master-0 kubenswrapper[29252]: I1203 20:09:34.179006 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tphq2\" (UniqueName: \"kubernetes.io/projected/d196dca7-f940-4aa0-b20a-214d22b62db6-kube-api-access-tphq2\") pod \"dns-default-dbfhg\" (UID: \"d196dca7-f940-4aa0-b20a-214d22b62db6\") " pod="openshift-dns/dns-default-dbfhg" Dec 03 20:09:34.207227 master-0 kubenswrapper[29252]: I1203 20:09:34.207173 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grk2s\" (UniqueName: \"kubernetes.io/projected/2d43df9b-bb29-4581-8cd9-f3b9c0c0e4d9-kube-api-access-grk2s\") pod \"migrator-5bcf58cf9c-h2w9j\" (UID: \"2d43df9b-bb29-4581-8cd9-f3b9c0c0e4d9\") " pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-h2w9j" Dec 03 20:09:34.219209 master-0 kubenswrapper[29252]: I1203 20:09:34.219148 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwcj7\" (UniqueName: 
\"kubernetes.io/projected/433c3273-c99e-4d68-befc-06f92d2fc8d5-kube-api-access-xwcj7\") pod \"cluster-baremetal-operator-5fdc576499-q47xb\" (UID: \"433c3273-c99e-4d68-befc-06f92d2fc8d5\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q47xb" Dec 03 20:09:34.234845 master-0 kubenswrapper[29252]: I1203 20:09:34.234800 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtwbs\" (UniqueName: \"kubernetes.io/projected/b84835e3-e8bc-4aa4-a8f3-f9be702a358a-kube-api-access-vtwbs\") pod \"csi-snapshot-controller-operator-7b795784b8-4gppw\" (UID: \"b84835e3-e8bc-4aa4-a8f3-f9be702a358a\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-4gppw" Dec 03 20:09:34.255300 master-0 kubenswrapper[29252]: I1203 20:09:34.255253 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv24n\" (UniqueName: \"kubernetes.io/projected/7ed25861-1328-45e7-922e-37588a0b019c-kube-api-access-cv24n\") pod \"cluster-node-tuning-operator-bbd9b9dff-vqzdb\" (UID: \"7ed25861-1328-45e7-922e-37588a0b019c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-vqzdb" Dec 03 20:09:34.274373 master-0 kubenswrapper[29252]: I1203 20:09:34.274281 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2c85\" (UniqueName: \"kubernetes.io/projected/46b5d4d0-b841-4e87-84b4-85911ff04325-kube-api-access-s2c85\") pod \"network-metrics-daemon-hs6gf\" (UID: \"46b5d4d0-b841-4e87-84b4-85911ff04325\") " pod="openshift-multus/network-metrics-daemon-hs6gf" Dec 03 20:09:34.298662 master-0 kubenswrapper[29252]: I1203 20:09:34.298610 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2qch\" (UniqueName: \"kubernetes.io/projected/b673cb04-f6f0-4113-bdcd-d6685b942c9f-kube-api-access-m2qch\") pod \"marketplace-operator-7d67745bb7-xqvv6\" (UID: \"b673cb04-f6f0-4113-bdcd-d6685b942c9f\") " 
pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6" Dec 03 20:09:34.317860 master-0 kubenswrapper[29252]: I1203 20:09:34.317804 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5decce88-c71e-411c-87b5-a37dd0f77e7b-bound-sa-token\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 20:09:34.339631 master-0 kubenswrapper[29252]: I1203 20:09:34.337495 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdhcd\" (UniqueName: \"kubernetes.io/projected/0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9-kube-api-access-qdhcd\") pod \"openshift-config-operator-68c95b6cf5-8xmrv\" (UID: \"0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 20:09:34.357612 master-0 kubenswrapper[29252]: I1203 20:09:34.357533 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pf5q\" (UniqueName: \"kubernetes.io/projected/73b7027e-44f5-4c7b-9226-585a90530535-kube-api-access-7pf5q\") pod \"operator-controller-controller-manager-5f78c89466-vkcnf\" (UID: \"73b7027e-44f5-4c7b-9226-585a90530535\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf" Dec 03 20:09:34.439967 master-0 kubenswrapper[29252]: I1203 20:09:34.439905 29252 request.go:700] Waited for 2.898207442s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-service-ca/serviceaccounts/service-ca/token Dec 03 20:09:34.502172 master-0 kubenswrapper[29252]: I1203 20:09:34.502078 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvklf\" (UniqueName: 
\"kubernetes.io/projected/f749c7f2-1fd7-4078-a92d-0ae5523998ac-kube-api-access-lvklf\") pod \"cluster-storage-operator-f84784664-wnl8p\" (UID: \"f749c7f2-1fd7-4078-a92d-0ae5523998ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-wnl8p" Dec 03 20:09:34.506534 master-0 kubenswrapper[29252]: I1203 20:09:34.506466 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95zsj\" (UniqueName: \"kubernetes.io/projected/1f82c7a1-ec21-497d-86f2-562cafa7ace7-kube-api-access-95zsj\") pod \"catalogd-controller-manager-754cfd84-xfv5j\" (UID: \"1f82c7a1-ec21-497d-86f2-562cafa7ace7\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j" Dec 03 20:09:34.508995 master-0 kubenswrapper[29252]: I1203 20:09:34.508944 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr8x9\" (UniqueName: \"kubernetes.io/projected/5decce88-c71e-411c-87b5-a37dd0f77e7b-kube-api-access-mr8x9\") pod \"cluster-image-registry-operator-65dc4bcb88-59j4p\" (UID: \"5decce88-c71e-411c-87b5-a37dd0f77e7b\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-59j4p" Dec 03 20:09:34.510505 master-0 kubenswrapper[29252]: I1203 20:09:34.510456 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5dpx\" (UniqueName: \"kubernetes.io/projected/9891cf64-59e8-4d8d-94fe-17cfa4b18c1b-kube-api-access-c5dpx\") pod \"machine-config-daemon-7t8bs\" (UID: \"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b\") " pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 20:09:34.514059 master-0 kubenswrapper[29252]: I1203 20:09:34.514006 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bdn5\" (UniqueName: \"kubernetes.io/projected/cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2-kube-api-access-7bdn5\") pod \"community-operators-98lh5\" (UID: \"cf5d6b8a-9fd1-4bd3-8b74-2d634caf7db2\") " 
pod="openshift-marketplace/community-operators-98lh5" Dec 03 20:09:34.518992 master-0 kubenswrapper[29252]: I1203 20:09:34.518936 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/943feb0d-7d31-446a-9100-dfc4ef013d12-kube-api-access\") pod \"kube-apiserver-operator-5b557b5f57-9t9fn\" (UID: \"943feb0d-7d31-446a-9100-dfc4ef013d12\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-9t9fn" Dec 03 20:09:34.526447 master-0 kubenswrapper[29252]: I1203 20:09:34.526269 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x66sr\" (UniqueName: \"kubernetes.io/projected/63e3d36d-1676-4f90-ac9a-d85b861a4655-kube-api-access-x66sr\") pod \"service-ca-6b8bb995f7-bj4vz\" (UID: \"63e3d36d-1676-4f90-ac9a-d85b861a4655\") " pod="openshift-service-ca/service-ca-6b8bb995f7-bj4vz" Dec 03 20:09:34.698815 master-0 kubenswrapper[29252]: I1203 20:09:34.698710 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/3.log" Dec 03 20:09:34.701419 master-0 kubenswrapper[29252]: I1203 20:09:34.701369 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/5.log" Dec 03 20:09:34.778533 master-0 kubenswrapper[29252]: E1203 20:09:34.778351 29252 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-master-0\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:34.778869 master-0 kubenswrapper[29252]: E1203 20:09:34.778548 29252 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" 
Dec 03 20:09:34.779549 master-0 kubenswrapper[29252]: E1203 20:09:34.779499 29252 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-startup-monitor-master-0\" already exists" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:09:34.784265 master-0 kubenswrapper[29252]: I1203 20:09:34.784198 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wcmd\" (UniqueName: \"kubernetes.io/projected/90610a53-b590-491e-8014-f0704afdc6e1-kube-api-access-4wcmd\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75\" (UID: \"90610a53-b590-491e-8014-f0704afdc6e1\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75" Dec 03 20:09:34.785362 master-0 kubenswrapper[29252]: E1203 20:09:34.785311 29252 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 20:09:34.785417 master-0 kubenswrapper[29252]: E1203 20:09:34.785364 29252 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 20:09:34.785506 master-0 kubenswrapper[29252]: E1203 20:09:34.785471 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access podName:e73e6013-87fc-40e2-a573-39930828faa7 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:35.28544019 +0000 UTC m=+10.098985183 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "e73e6013-87fc-40e2-a573-39930828faa7") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 20:09:34.786300 master-0 kubenswrapper[29252]: I1203 20:09:34.786255 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkhcw\" (UniqueName: \"kubernetes.io/projected/830d89af-1266-43ac-b113-990a28595f91-kube-api-access-lkhcw\") pod \"network-check-target-x6vwd\" (UID: \"830d89af-1266-43ac-b113-990a28595f91\") " pod="openshift-network-diagnostics/network-check-target-x6vwd" Dec 03 20:09:34.787905 master-0 kubenswrapper[29252]: I1203 20:09:34.787858 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3-kube-api-access\") pod \"kube-controller-manager-operator-b5dddf8f5-79ccj\" (UID: \"e90437d9-b34d-4e86-8fa4-c2b3f35d2cb3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-79ccj" Dec 03 20:09:34.788592 master-0 kubenswrapper[29252]: I1203 20:09:34.788554 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzd2g\" (UniqueName: \"kubernetes.io/projected/8dbbb6f8-711c-49a0-bc36-fa5d50124bd8-kube-api-access-qzd2g\") pod \"machine-config-operator-664c9d94c9-lt6dx\" (UID: \"8dbbb6f8-711c-49a0-bc36-fa5d50124bd8\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-lt6dx" Dec 03 20:09:34.788794 master-0 kubenswrapper[29252]: I1203 20:09:34.788729 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3ee9a2-0f17-4a04-9191-b60684ef6c29-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f574c6c79-j2wgx\" (UID: 
\"5b3ee9a2-0f17-4a04-9191-b60684ef6c29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-j2wgx" Dec 03 20:09:34.792756 master-0 kubenswrapper[29252]: I1203 20:09:34.792709 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgmkc\" (UniqueName: \"kubernetes.io/projected/a710102c-72fb-4d8d-ad99-71940368a09e-kube-api-access-zgmkc\") pod \"redhat-operators-9smb5\" (UID: \"a710102c-72fb-4d8d-ad99-71940368a09e\") " pod="openshift-marketplace/redhat-operators-9smb5" Dec 03 20:09:34.793607 master-0 kubenswrapper[29252]: I1203 20:09:34.793554 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkhn4\" (UniqueName: \"kubernetes.io/projected/f96c70ce-314a-4919-91e9-cc776a620846-kube-api-access-lkhn4\") pod \"apiserver-597ff7d589-qjxsb\" (UID: \"f96c70ce-314a-4919-91e9-cc776a620846\") " pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 20:09:34.798975 master-0 kubenswrapper[29252]: I1203 20:09:34.798926 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59d2r\" (UniqueName: \"kubernetes.io/projected/78a864f2-934f-4197-9753-24c9bc7f1fca-kube-api-access-59d2r\") pod \"etcd-operator-7978bf889c-mqpzf\" (UID: \"78a864f2-934f-4197-9753-24c9bc7f1fca\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-mqpzf" Dec 03 20:09:34.811936 master-0 kubenswrapper[29252]: I1203 20:09:34.811880 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhg82\" (UniqueName: \"kubernetes.io/projected/c4d45235-fb1a-4626-a41e-b1e34f7bf76e-kube-api-access-qhg82\") pod \"network-node-identity-r2kpn\" (UID: \"c4d45235-fb1a-4626-a41e-b1e34f7bf76e\") " pod="openshift-network-node-identity/network-node-identity-r2kpn" Dec 03 20:09:34.843744 master-0 kubenswrapper[29252]: E1203 20:09:34.843682 29252 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" 
expected="1s" actual="1.105s" Dec 03 20:09:34.843744 master-0 kubenswrapper[29252]: I1203 20:09:34.843737 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Dec 03 20:09:34.843744 master-0 kubenswrapper[29252]: I1203 20:09:34.843753 29252 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="6e68aef7-c088-485c-8d5f-0a681bed67ae" Dec 03 20:09:34.844093 master-0 kubenswrapper[29252]: I1203 20:09:34.843893 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:34.844093 master-0 kubenswrapper[29252]: I1203 20:09:34.843955 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:34.844093 master-0 kubenswrapper[29252]: I1203 20:09:34.843991 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Dec 03 20:09:34.844093 master-0 kubenswrapper[29252]: I1203 20:09:34.844007 29252 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="6e68aef7-c088-485c-8d5f-0a681bed67ae" Dec 03 20:09:34.844093 master-0 kubenswrapper[29252]: I1203 20:09:34.844051 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 03 20:09:34.844093 master-0 kubenswrapper[29252]: I1203 20:09:34.844093 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-b46c54696-bgb45" Dec 03 20:09:34.844312 master-0 kubenswrapper[29252]: I1203 20:09:34.844174 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 20:09:34.844312 
master-0 kubenswrapper[29252]: I1203 20:09:34.844235 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-25z8n" Dec 03 20:09:34.844524 master-0 kubenswrapper[29252]: I1203 20:09:34.844449 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 20:09:34.844579 master-0 kubenswrapper[29252]: I1203 20:09:34.844546 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"efa3433149c0833909dd6c97d45272ed","Type":"ContainerStarted","Data":"ee2aaab9b8550f344df3e7445ae5d2dcb743224979d469841109025bb15970fd"} Dec 03 20:09:34.844618 master-0 kubenswrapper[29252]: I1203 20:09:34.844591 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"efa3433149c0833909dd6c97d45272ed","Type":"ContainerDied","Data":"ee2aaab9b8550f344df3e7445ae5d2dcb743224979d469841109025bb15970fd"} Dec 03 20:09:34.844662 master-0 kubenswrapper[29252]: I1203 20:09:34.844623 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-l7bzj" event={"ID":"3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf","Type":"ContainerStarted","Data":"a6d8139c2a289f24bcfbfac9c2365b666fc7666e98ff6bacd47d9799604a7b87"} Dec 03 20:09:34.845010 master-0 kubenswrapper[29252]: I1203 20:09:34.844956 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-8xmrv" Dec 03 20:09:34.845070 master-0 kubenswrapper[29252]: I1203 20:09:34.845015 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-mfwhz" 
event={"ID":"a185ee17-4b4b-4d20-a8ed-56a2a01f1807","Type":"ContainerStarted","Data":"7df3a91abfdd8c5166f1d44634b884dcb397584ce15e909c4d4da217a2980158"} Dec 03 20:09:34.845070 master-0 kubenswrapper[29252]: I1203 20:09:34.845057 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-dbfhg" Dec 03 20:09:34.845184 master-0 kubenswrapper[29252]: I1203 20:09:34.845102 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dbfhg" Dec 03 20:09:34.878202 master-0 kubenswrapper[29252]: E1203 20:09:34.878141 29252 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0" Dec 03 20:09:34.879410 master-0 kubenswrapper[29252]: E1203 20:09:34.879365 29252 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"openshift-kube-scheduler-master-0\" already exists" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 20:09:35.015910 master-0 kubenswrapper[29252]: I1203 20:09:35.015853 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 20:09:35.016135 master-0 kubenswrapper[29252]: I1203 20:09:35.015946 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb" Dec 03 20:09:35.190807 master-0 kubenswrapper[29252]: I1203 20:09:35.190730 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:09:35.195395 master-0 kubenswrapper[29252]: I1203 20:09:35.195341 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:09:35.332992 master-0 kubenswrapper[29252]: I1203 20:09:35.332200 29252 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:09:35.332992 master-0 kubenswrapper[29252]: E1203 20:09:35.332379 29252 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 20:09:35.332992 master-0 kubenswrapper[29252]: E1203 20:09:35.332398 29252 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 20:09:35.332992 master-0 kubenswrapper[29252]: E1203 20:09:35.332456 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access podName:e73e6013-87fc-40e2-a573-39930828faa7 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:36.332439866 +0000 UTC m=+11.145984819 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "e73e6013-87fc-40e2-a573-39930828faa7") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 20:09:35.403582 master-0 kubenswrapper[29252]: I1203 20:09:35.403534 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 20:09:35.408008 master-0 kubenswrapper[29252]: I1203 20:09:35.407984 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-pqz7q" Dec 03 20:09:35.706705 master-0 kubenswrapper[29252]: I1203 20:09:35.706650 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:35.710545 master-0 kubenswrapper[29252]: I1203 20:09:35.710489 29252 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 20:09:35.714056 master-0 kubenswrapper[29252]: I1203 20:09:35.714016 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Dec 03 20:09:35.732361 master-0 kubenswrapper[29252]: I1203 20:09:35.732304 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Dec 03 20:09:35.740798 master-0 kubenswrapper[29252]: E1203 20:09:35.740747 29252 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:35.741185 master-0 kubenswrapper[29252]: I1203 20:09:35.741164 29252 scope.go:117] "RemoveContainer" containerID="ee2aaab9b8550f344df3e7445ae5d2dcb743224979d469841109025bb15970fd" Dec 03 20:09:35.775961 master-0 kubenswrapper[29252]: I1203 
20:09:35.775907 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 20:09:35.779957 master-0 kubenswrapper[29252]: I1203 20:09:35.779917 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 20:09:36.349127 master-0 kubenswrapper[29252]: I1203 20:09:36.348971 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:09:36.349947 master-0 kubenswrapper[29252]: E1203 20:09:36.349154 29252 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 20:09:36.349947 master-0 kubenswrapper[29252]: E1203 20:09:36.349173 29252 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 20:09:36.349947 master-0 kubenswrapper[29252]: E1203 20:09:36.349223 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access podName:e73e6013-87fc-40e2-a573-39930828faa7 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:38.349207608 +0000 UTC m=+13.162752561 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "e73e6013-87fc-40e2-a573-39930828faa7") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 20:09:36.440528 master-0 kubenswrapper[29252]: I1203 20:09:36.440252 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj" Dec 03 20:09:36.446188 master-0 kubenswrapper[29252]: I1203 20:09:36.446137 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj" Dec 03 20:09:36.721318 master-0 kubenswrapper[29252]: I1203 20:09:36.721269 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_efa3433149c0833909dd6c97d45272ed/kube-apiserver-check-endpoints/1.log" Dec 03 20:09:36.724036 master-0 kubenswrapper[29252]: I1203 20:09:36.723968 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"efa3433149c0833909dd6c97d45272ed","Type":"ContainerStarted","Data":"76f9ca425f52f5ae839d8475d3bce62ca26fad716e336398fd44de61a9633f54"} Dec 03 20:09:36.724673 master-0 kubenswrapper[29252]: I1203 20:09:36.724641 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:09:37.579856 master-0 kubenswrapper[29252]: I1203 20:09:37.579741 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 03 20:09:37.583076 master-0 kubenswrapper[29252]: I1203 20:09:37.583040 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/packageserver-9b474c48f-lx8ch" Dec 03 20:09:37.717166 master-0 kubenswrapper[29252]: I1203 20:09:37.716120 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=18.716101528 podStartE2EDuration="18.716101528s" podCreationTimestamp="2025-12-03 20:09:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:09:37.656757297 +0000 UTC m=+12.470302260" watchObservedRunningTime="2025-12-03 20:09:37.716101528 +0000 UTC m=+12.529646481" Dec 03 20:09:37.722988 master-0 kubenswrapper[29252]: I1203 20:09:37.722736 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:09:38.221117 master-0 kubenswrapper[29252]: I1203 20:09:38.221033 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=4.221011356 podStartE2EDuration="4.221011356s" podCreationTimestamp="2025-12-03 20:09:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:09:38.202403381 +0000 UTC m=+13.015948344" watchObservedRunningTime="2025-12-03 20:09:38.221011356 +0000 UTC m=+13.034556309" Dec 03 20:09:38.378631 master-0 kubenswrapper[29252]: I1203 20:09:38.378576 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:09:38.378845 master-0 kubenswrapper[29252]: E1203 20:09:38.378766 29252 
projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Dec 03 20:09:38.378845 master-0 kubenswrapper[29252]: E1203 20:09:38.378810 29252 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Dec 03 20:09:38.378935 master-0 kubenswrapper[29252]: E1203 20:09:38.378863 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access podName:e73e6013-87fc-40e2-a573-39930828faa7 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:42.378846478 +0000 UTC m=+17.192391431 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "e73e6013-87fc-40e2-a573-39930828faa7") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Dec 03 20:09:38.787806 master-0 kubenswrapper[29252]: I1203 20:09:38.787711 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9smb5"
Dec 03 20:09:38.903745 master-0 kubenswrapper[29252]: I1203 20:09:38.903669 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6"
Dec 03 20:09:38.910373 master-0 kubenswrapper[29252]: I1203 20:09:38.910301 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-7d67745bb7-xqvv6"
Dec 03 20:09:39.105104 master-0 kubenswrapper[29252]: I1203 20:09:39.104946 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-b46c54696-bgb45"
Dec 03 20:09:39.105104 master-0 kubenswrapper[29252]: I1203 20:09:39.105001 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9smb5"
Dec 03 20:09:39.223567 master-0 kubenswrapper[29252]: I1203 20:09:39.223496 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wcnrx"
Dec 03 20:09:39.276117 master-0 kubenswrapper[29252]: I1203 20:09:39.276038 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Dec 03 20:09:39.281396 master-0 kubenswrapper[29252]: I1203 20:09:39.281353 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Dec 03 20:09:39.286358 master-0 kubenswrapper[29252]: I1203 20:09:39.286320 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wcnrx"
Dec 03 20:09:39.398466 master-0 kubenswrapper[29252]: I1203 20:09:39.398403 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Dec 03 20:09:39.398751 master-0 kubenswrapper[29252]: I1203 20:09:39.398699 29252 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Dec 03 20:09:39.398916 master-0 kubenswrapper[29252]: I1203 20:09:39.398766 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Dec 03 20:09:39.752265 master-0 kubenswrapper[29252]: I1203 20:09:39.752161 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf"
Dec 03 20:09:39.757389 master-0 kubenswrapper[29252]: I1203 20:09:39.757326 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-vkcnf"
Dec 03 20:09:39.757655 master-0 kubenswrapper[29252]: I1203 20:09:39.757629 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Dec 03 20:09:39.764562 master-0 kubenswrapper[29252]: I1203 20:09:39.764465 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Dec 03 20:09:40.200009 master-0 kubenswrapper[29252]: I1203 20:09:40.199944 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb"
Dec 03 20:09:40.205800 master-0 kubenswrapper[29252]: I1203 20:09:40.205748 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-597ff7d589-qjxsb"
Dec 03 20:09:40.568886 master-0 kubenswrapper[29252]: I1203 20:09:40.568754 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wcnrx"
Dec 03 20:09:40.610059 master-0 kubenswrapper[29252]: I1203 20:09:40.610022 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wcnrx"
Dec 03 20:09:41.350307 master-0 kubenswrapper[29252]: I1203 20:09:41.350241 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-98lh5"
Dec 03 20:09:41.398308 master-0 kubenswrapper[29252]: I1203 20:09:41.398184 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-98lh5"
Dec 03 20:09:41.722242 master-0 kubenswrapper[29252]: I1203 20:09:41.722081 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-x6vwd"
Dec 03 20:09:41.724123 master-0 kubenswrapper[29252]: I1203 20:09:41.724087 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-x6vwd"
Dec 03 20:09:42.434241 master-0 kubenswrapper[29252]: I1203 20:09:42.434137 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Dec 03 20:09:42.435279 master-0 kubenswrapper[29252]: E1203 20:09:42.434312 29252 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Dec 03 20:09:42.435279 master-0 kubenswrapper[29252]: E1203 20:09:42.434333 29252 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Dec 03 20:09:42.435279 master-0 kubenswrapper[29252]: E1203 20:09:42.434390 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access podName:e73e6013-87fc-40e2-a573-39930828faa7 nodeName:}" failed. No retries permitted until 2025-12-03 20:09:50.434372099 +0000 UTC m=+25.247917062 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "e73e6013-87fc-40e2-a573-39930828faa7") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Dec 03 20:09:42.662103 master-0 kubenswrapper[29252]: I1203 20:09:42.662021 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 20:09:42.692696 master-0 kubenswrapper[29252]: I1203 20:09:42.692567 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 20:09:42.712031 master-0 kubenswrapper[29252]: I1203 20:09:42.711966 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 20:09:42.740369 master-0 kubenswrapper[29252]: I1203 20:09:42.739629 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 20:09:42.769974 master-0 kubenswrapper[29252]: I1203 20:09:42.769923 29252 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 20:09:43.003277 master-0 kubenswrapper[29252]: I1203 20:09:43.003070 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mg96g"
Dec 03 20:09:43.072491 master-0 kubenswrapper[29252]: I1203 20:09:43.072377 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mg96g"
Dec 03 20:09:43.362069 master-0 kubenswrapper[29252]: I1203 20:09:43.362019 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-98lh5"
Dec 03 20:09:43.404928 master-0 kubenswrapper[29252]: I1203 20:09:43.404877 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-98lh5"
Dec 03 20:09:43.454593 master-0 kubenswrapper[29252]: I1203 20:09:43.454513 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 20:09:43.457959 master-0 kubenswrapper[29252]: I1203 20:09:43.457912 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-wg7fw"
Dec 03 20:09:43.621700 master-0 kubenswrapper[29252]: I1203 20:09:43.621575 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mg96g"
Dec 03 20:09:43.774847 master-0 kubenswrapper[29252]: I1203 20:09:43.774788 29252 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 20:09:43.777591 master-0 kubenswrapper[29252]: I1203 20:09:43.777555 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mg96g"
Dec 03 20:09:44.644756 master-0 kubenswrapper[29252]: I1203 20:09:44.644565 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j"
Dec 03 20:09:44.645922 master-0 kubenswrapper[29252]: I1203 20:09:44.645897 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-xfv5j"
Dec 03 20:09:46.009105 master-0 kubenswrapper[29252]: I1203 20:09:46.009034 29252 patch_prober.go:28] interesting pod/machine-config-daemon-7t8bs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 20:09:46.010394 master-0 kubenswrapper[29252]: I1203 20:09:46.009929 29252 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 20:09:48.171675 master-0 kubenswrapper[29252]: I1203 20:09:48.171473 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 20:09:48.172963 master-0 kubenswrapper[29252]: I1203 20:09:48.171731 29252 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 20:09:48.203566 master-0 kubenswrapper[29252]: I1203 20:09:48.203501 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l9m2r"
Dec 03 20:09:48.838300 master-0 kubenswrapper[29252]: I1203 20:09:48.838240 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9smb5"
Dec 03 20:09:48.877504 master-0 kubenswrapper[29252]: I1203 20:09:48.877439 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9smb5"
Dec 03 20:09:49.399194 master-0 kubenswrapper[29252]: I1203 20:09:49.399125 29252 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Dec 03 20:09:49.399194 master-0 kubenswrapper[29252]: I1203 20:09:49.399184 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Dec 03 20:09:50.473213 master-0 kubenswrapper[29252]: I1203 20:09:50.473141 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Dec 03 20:09:50.473807 master-0 kubenswrapper[29252]: E1203 20:09:50.473444 29252 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Dec 03 20:09:50.473807 master-0 kubenswrapper[29252]: E1203 20:09:50.473502 29252 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Dec 03 20:09:50.473807 master-0 kubenswrapper[29252]: E1203 20:09:50.473602 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access podName:e73e6013-87fc-40e2-a573-39930828faa7 nodeName:}" failed. No retries permitted until 2025-12-03 20:10:06.473571628 +0000 UTC m=+41.287116621 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "e73e6013-87fc-40e2-a573-39930828faa7") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Dec 03 20:09:52.180197 master-0 kubenswrapper[29252]: I1203 20:09:52.180119 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Dec 03 20:09:58.137818 master-0 kubenswrapper[29252]: I1203 20:09:58.137733 29252 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Dec 03 20:09:58.138484 master-0 kubenswrapper[29252]: I1203 20:09:58.138050 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="b0f7c518a656139710b17a7667c8b898" containerName="startup-monitor" containerID="cri-o://10dd5e50757ca6d8fb428d9d41440e88b1cc3fce51685a0860bb2b0898ea0950" gracePeriod=5
Dec 03 20:09:59.399416 master-0 kubenswrapper[29252]: I1203 20:09:59.399358 29252 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Dec 03 20:09:59.399416 master-0 kubenswrapper[29252]: I1203 20:09:59.399436 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Dec 03 20:09:59.400139 master-0 kubenswrapper[29252]: I1203 20:09:59.399487 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Dec 03 20:09:59.400139 master-0 kubenswrapper[29252]: I1203 20:09:59.400011 29252 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Dec 03 20:09:59.400225 master-0 kubenswrapper[29252]: I1203 20:09:59.400174 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager" containerID="cri-o://e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599" gracePeriod=30
Dec 03 20:10:03.916720 master-0 kubenswrapper[29252]: I1203 20:10:03.916641 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_b0f7c518a656139710b17a7667c8b898/startup-monitor/0.log"
Dec 03 20:10:03.916720 master-0 kubenswrapper[29252]: I1203 20:10:03.916710 29252 generic.go:334] "Generic (PLEG): container finished" podID="b0f7c518a656139710b17a7667c8b898" containerID="10dd5e50757ca6d8fb428d9d41440e88b1cc3fce51685a0860bb2b0898ea0950" exitCode=137
Dec 03 20:10:03.917754 master-0 kubenswrapper[29252]: I1203 20:10:03.916755 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcef5c26197d88811bd202bc70d1bd384b05a27d2d38eb35b486b482203bd347"
Dec 03 20:10:03.920927 master-0 kubenswrapper[29252]: I1203 20:10:03.920813 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_b0f7c518a656139710b17a7667c8b898/startup-monitor/0.log"
Dec 03 20:10:03.920927 master-0 kubenswrapper[29252]: I1203 20:10:03.920868 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Dec 03 20:10:03.960573 master-0 kubenswrapper[29252]: I1203 20:10:03.960505 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-resource-dir\") pod \"b0f7c518a656139710b17a7667c8b898\" (UID: \"b0f7c518a656139710b17a7667c8b898\") "
Dec 03 20:10:03.960789 master-0 kubenswrapper[29252]: I1203 20:10:03.960632 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-manifests\") pod \"b0f7c518a656139710b17a7667c8b898\" (UID: \"b0f7c518a656139710b17a7667c8b898\") "
Dec 03 20:10:03.960789 master-0 kubenswrapper[29252]: I1203 20:10:03.960691 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-pod-resource-dir\") pod \"b0f7c518a656139710b17a7667c8b898\" (UID: \"b0f7c518a656139710b17a7667c8b898\") "
Dec 03 20:10:03.960789 master-0 kubenswrapper[29252]: I1203 20:10:03.960688 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "b0f7c518a656139710b17a7667c8b898" (UID: "b0f7c518a656139710b17a7667c8b898"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:10:03.960896 master-0 kubenswrapper[29252]: I1203 20:10:03.960728 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-var-lock\") pod \"b0f7c518a656139710b17a7667c8b898\" (UID: \"b0f7c518a656139710b17a7667c8b898\") "
Dec 03 20:10:03.960896 master-0 kubenswrapper[29252]: I1203 20:10:03.960878 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-var-lock" (OuterVolumeSpecName: "var-lock") pod "b0f7c518a656139710b17a7667c8b898" (UID: "b0f7c518a656139710b17a7667c8b898"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:10:03.960954 master-0 kubenswrapper[29252]: I1203 20:10:03.960866 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-manifests" (OuterVolumeSpecName: "manifests") pod "b0f7c518a656139710b17a7667c8b898" (UID: "b0f7c518a656139710b17a7667c8b898"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:10:03.960990 master-0 kubenswrapper[29252]: I1203 20:10:03.960953 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-var-log\") pod \"b0f7c518a656139710b17a7667c8b898\" (UID: \"b0f7c518a656139710b17a7667c8b898\") "
Dec 03 20:10:03.961109 master-0 kubenswrapper[29252]: I1203 20:10:03.961070 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-var-log" (OuterVolumeSpecName: "var-log") pod "b0f7c518a656139710b17a7667c8b898" (UID: "b0f7c518a656139710b17a7667c8b898"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:10:03.961573 master-0 kubenswrapper[29252]: I1203 20:10:03.961537 29252 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-var-log\") on node \"master-0\" DevicePath \"\""
Dec 03 20:10:03.961573 master-0 kubenswrapper[29252]: I1203 20:10:03.961567 29252 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-resource-dir\") on node \"master-0\" DevicePath \"\""
Dec 03 20:10:03.961648 master-0 kubenswrapper[29252]: I1203 20:10:03.961580 29252 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-manifests\") on node \"master-0\" DevicePath \"\""
Dec 03 20:10:03.961648 master-0 kubenswrapper[29252]: I1203 20:10:03.961591 29252 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-var-lock\") on node \"master-0\" DevicePath \"\""
Dec 03 20:10:03.968697 master-0 kubenswrapper[29252]: I1203 20:10:03.968585 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "b0f7c518a656139710b17a7667c8b898" (UID: "b0f7c518a656139710b17a7667c8b898"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:10:04.063314 master-0 kubenswrapper[29252]: I1203 20:10:04.063231 29252 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/b0f7c518a656139710b17a7667c8b898-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Dec 03 20:10:04.923019 master-0 kubenswrapper[29252]: I1203 20:10:04.922922 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Dec 03 20:10:04.975234 master-0 kubenswrapper[29252]: I1203 20:10:04.975134 29252 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="d7e4a6aa-13c6-4151-b773-bdc0b0b6ea31"
Dec 03 20:10:05.430065 master-0 kubenswrapper[29252]: I1203 20:10:05.429895 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0f7c518a656139710b17a7667c8b898" path="/var/lib/kubelet/pods/b0f7c518a656139710b17a7667c8b898/volumes"
Dec 03 20:10:05.430456 master-0 kubenswrapper[29252]: I1203 20:10:05.430401 29252 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID=""
Dec 03 20:10:05.639934 master-0 kubenswrapper[29252]: I1203 20:10:05.639821 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Dec 03 20:10:05.639934 master-0 kubenswrapper[29252]: I1203 20:10:05.639892 29252 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="d7e4a6aa-13c6-4151-b773-bdc0b0b6ea31"
Dec 03 20:10:05.641541 master-0 kubenswrapper[29252]: I1203 20:10:05.641469 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Dec 03 20:10:05.641697 master-0 kubenswrapper[29252]: I1203 20:10:05.641544 29252 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="d7e4a6aa-13c6-4151-b773-bdc0b0b6ea31"
Dec 03 20:10:06.494419 master-0 kubenswrapper[29252]: I1203 20:10:06.494319 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Dec 03 20:10:06.495173 master-0 kubenswrapper[29252]: E1203 20:10:06.494541 29252 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Dec 03 20:10:06.495173 master-0 kubenswrapper[29252]: E1203 20:10:06.494564 29252 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Dec 03 20:10:06.495173 master-0 kubenswrapper[29252]: E1203 20:10:06.494619 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access podName:e73e6013-87fc-40e2-a573-39930828faa7 nodeName:}" failed. No retries permitted until 2025-12-03 20:10:38.49459854 +0000 UTC m=+73.308143493 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "e73e6013-87fc-40e2-a573-39930828faa7") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Dec 03 20:10:16.009083 master-0 kubenswrapper[29252]: I1203 20:10:16.009034 29252 patch_prober.go:28] interesting pod/machine-config-daemon-7t8bs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 20:10:16.009732 master-0 kubenswrapper[29252]: I1203 20:10:16.009097 29252 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 20:10:27.160689 master-0 kubenswrapper[29252]: I1203 20:10:27.160609 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Dec 03 20:10:27.163268 master-0 kubenswrapper[29252]: E1203 20:10:27.163229 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9afe01c7-825c-43d1-8425-0317cdde11d6" containerName="installer"
Dec 03 20:10:27.163268 master-0 kubenswrapper[29252]: I1203 20:10:27.163267 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="9afe01c7-825c-43d1-8425-0317cdde11d6" containerName="installer"
Dec 03 20:10:27.163412 master-0 kubenswrapper[29252]: E1203 20:10:27.163291 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e73e6013-87fc-40e2-a573-39930828faa7" containerName="installer"
Dec 03 20:10:27.163412 master-0 kubenswrapper[29252]: I1203 20:10:27.163305 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="e73e6013-87fc-40e2-a573-39930828faa7" containerName="installer"
Dec 03 20:10:27.163412 master-0 kubenswrapper[29252]: E1203 20:10:27.163326 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b6e1832-278b-4e37-b92b-2584e2daa34c" containerName="assisted-installer-controller"
Dec 03 20:10:27.163412 master-0 kubenswrapper[29252]: I1203 20:10:27.163338 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b6e1832-278b-4e37-b92b-2584e2daa34c" containerName="assisted-installer-controller"
Dec 03 20:10:27.163412 master-0 kubenswrapper[29252]: E1203 20:10:27.163375 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4afc7a-a338-4a2c-bada-22d4bac75d49" containerName="installer"
Dec 03 20:10:27.163412 master-0 kubenswrapper[29252]: I1203 20:10:27.163387 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4afc7a-a338-4a2c-bada-22d4bac75d49" containerName="installer"
Dec 03 20:10:27.163412 master-0 kubenswrapper[29252]: E1203 20:10:27.163409 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d7367df-4046-4972-abc2-f07eade0ac6b" containerName="installer"
Dec 03 20:10:27.163695 master-0 kubenswrapper[29252]: I1203 20:10:27.163422 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d7367df-4046-4972-abc2-f07eade0ac6b" containerName="installer"
Dec 03 20:10:27.163695 master-0 kubenswrapper[29252]: E1203 20:10:27.163438 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="186cc14f-5f58-43ca-8ffa-db07606ff0f7" containerName="installer"
Dec 03 20:10:27.163695 master-0 kubenswrapper[29252]: I1203 20:10:27.163452 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="186cc14f-5f58-43ca-8ffa-db07606ff0f7" containerName="installer"
Dec 03 20:10:27.163695 master-0 kubenswrapper[29252]: E1203 20:10:27.163474 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bacd155a-fee3-4e5e-89a2-ab86f401d2ff" containerName="installer"
Dec 03 20:10:27.163695 master-0 kubenswrapper[29252]: I1203 20:10:27.163486 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="bacd155a-fee3-4e5e-89a2-ab86f401d2ff" containerName="installer"
Dec 03 20:10:27.163695 master-0 kubenswrapper[29252]: E1203 20:10:27.163505 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0f7c518a656139710b17a7667c8b898" containerName="startup-monitor"
Dec 03 20:10:27.163695 master-0 kubenswrapper[29252]: I1203 20:10:27.163517 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f7c518a656139710b17a7667c8b898" containerName="startup-monitor"
Dec 03 20:10:27.163695 master-0 kubenswrapper[29252]: I1203 20:10:27.163696 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b6e1832-278b-4e37-b92b-2584e2daa34c" containerName="assisted-installer-controller"
Dec 03 20:10:27.164104 master-0 kubenswrapper[29252]: I1203 20:10:27.163732 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="186cc14f-5f58-43ca-8ffa-db07606ff0f7" containerName="installer"
Dec 03 20:10:27.164104 master-0 kubenswrapper[29252]: I1203 20:10:27.163762 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="bacd155a-fee3-4e5e-89a2-ab86f401d2ff" containerName="installer"
Dec 03 20:10:27.164104 master-0 kubenswrapper[29252]: I1203 20:10:27.163839 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce4afc7a-a338-4a2c-bada-22d4bac75d49" containerName="installer"
Dec 03 20:10:27.164104 master-0 kubenswrapper[29252]: I1203 20:10:27.163865 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d7367df-4046-4972-abc2-f07eade0ac6b" containerName="installer"
Dec 03 20:10:27.164104 master-0 kubenswrapper[29252]: I1203 20:10:27.163886 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="e73e6013-87fc-40e2-a573-39930828faa7" containerName="installer"
Dec 03 20:10:27.164104 master-0 kubenswrapper[29252]: I1203 20:10:27.163907 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="9afe01c7-825c-43d1-8425-0317cdde11d6" containerName="installer"
Dec 03 20:10:27.164104 master-0 kubenswrapper[29252]: I1203 20:10:27.163923 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0f7c518a656139710b17a7667c8b898" containerName="startup-monitor"
Dec 03 20:10:27.164545 master-0 kubenswrapper[29252]: I1203 20:10:27.164504 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0"
Dec 03 20:10:27.168033 master-0 kubenswrapper[29252]: I1203 20:10:27.167967 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-n2brl"
Dec 03 20:10:27.168209 master-0 kubenswrapper[29252]: I1203 20:10:27.168168 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Dec 03 20:10:27.182546 master-0 kubenswrapper[29252]: I1203 20:10:27.182494 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Dec 03 20:10:27.292354 master-0 kubenswrapper[29252]: I1203 20:10:27.292303 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/07bd5fbb-a709-44d4-a6a7-b09c89338bd6-var-lock\") pod \"installer-2-master-0\" (UID: \"07bd5fbb-a709-44d4-a6a7-b09c89338bd6\") " pod="openshift-kube-apiserver/installer-2-master-0"
Dec 03 20:10:27.292659 master-0 kubenswrapper[29252]: I1203 20:10:27.292643 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07bd5fbb-a709-44d4-a6a7-b09c89338bd6-kube-api-access\") pod \"installer-2-master-0\" (UID: \"07bd5fbb-a709-44d4-a6a7-b09c89338bd6\") " pod="openshift-kube-apiserver/installer-2-master-0"
Dec 03 20:10:27.292744 master-0 kubenswrapper[29252]: I1203 20:10:27.292729 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07bd5fbb-a709-44d4-a6a7-b09c89338bd6-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"07bd5fbb-a709-44d4-a6a7-b09c89338bd6\") " pod="openshift-kube-apiserver/installer-2-master-0"
Dec 03 20:10:27.394517 master-0 kubenswrapper[29252]: I1203 20:10:27.394450 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07bd5fbb-a709-44d4-a6a7-b09c89338bd6-kube-api-access\") pod \"installer-2-master-0\" (UID: \"07bd5fbb-a709-44d4-a6a7-b09c89338bd6\") " pod="openshift-kube-apiserver/installer-2-master-0"
Dec 03 20:10:27.394930 master-0 kubenswrapper[29252]: I1203 20:10:27.394900 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07bd5fbb-a709-44d4-a6a7-b09c89338bd6-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"07bd5fbb-a709-44d4-a6a7-b09c89338bd6\") " pod="openshift-kube-apiserver/installer-2-master-0"
Dec 03 20:10:27.395192 master-0 kubenswrapper[29252]: I1203 20:10:27.395161 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/07bd5fbb-a709-44d4-a6a7-b09c89338bd6-var-lock\") pod \"installer-2-master-0\" (UID: \"07bd5fbb-a709-44d4-a6a7-b09c89338bd6\") " pod="openshift-kube-apiserver/installer-2-master-0"
Dec 03 20:10:27.395402 master-0 kubenswrapper[29252]: I1203 20:10:27.395017 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07bd5fbb-a709-44d4-a6a7-b09c89338bd6-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"07bd5fbb-a709-44d4-a6a7-b09c89338bd6\") " pod="openshift-kube-apiserver/installer-2-master-0"
Dec 03 20:10:27.395581 master-0 kubenswrapper[29252]: I1203 20:10:27.395269 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/07bd5fbb-a709-44d4-a6a7-b09c89338bd6-var-lock\") pod \"installer-2-master-0\" (UID: \"07bd5fbb-a709-44d4-a6a7-b09c89338bd6\") " pod="openshift-kube-apiserver/installer-2-master-0"
Dec 03 20:10:27.408984 master-0 kubenswrapper[29252]: I1203 20:10:27.408888 29252 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Dec 03 20:10:27.414103 master-0 kubenswrapper[29252]: I1203 20:10:27.414009 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07bd5fbb-a709-44d4-a6a7-b09c89338bd6-kube-api-access\") pod \"installer-2-master-0\" (UID: \"07bd5fbb-a709-44d4-a6a7-b09c89338bd6\") " pod="openshift-kube-apiserver/installer-2-master-0"
Dec 03 20:10:27.502568 master-0 kubenswrapper[29252]: I1203 20:10:27.502468 29252 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Dec 03 20:10:27.961064 master-0 kubenswrapper[29252]: I1203 20:10:27.960931 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Dec 03 20:10:27.970913 master-0 kubenswrapper[29252]: W1203 20:10:27.970857 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod07bd5fbb_a709_44d4_a6a7_b09c89338bd6.slice/crio-f7112ad6a359b0ed68657eb14eb0487cdf0b5d7b63e1eb39a83e92e1bf8cc377 WatchSource:0}: Error finding container f7112ad6a359b0ed68657eb14eb0487cdf0b5d7b63e1eb39a83e92e1bf8cc377: Status 404 returned error can't find the container with id f7112ad6a359b0ed68657eb14eb0487cdf0b5d7b63e1eb39a83e92e1bf8cc377 Dec 03 20:10:28.103464 master-0 kubenswrapper[29252]: I1203 20:10:28.101902 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"07bd5fbb-a709-44d4-a6a7-b09c89338bd6","Type":"ContainerStarted","Data":"f7112ad6a359b0ed68657eb14eb0487cdf0b5d7b63e1eb39a83e92e1bf8cc377"} Dec 03 20:10:28.797609 master-0 kubenswrapper[29252]: I1203 20:10:28.797522 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Dec 03 20:10:28.798710 master-0 kubenswrapper[29252]: I1203 20:10:28.798665 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 20:10:28.801398 master-0 kubenswrapper[29252]: I1203 20:10:28.801337 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-k7vtq" Dec 03 20:10:28.801764 master-0 kubenswrapper[29252]: I1203 20:10:28.801696 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Dec 03 20:10:28.922774 master-0 kubenswrapper[29252]: I1203 20:10:28.922676 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a0b70764-d088-42e7-8480-e7bbdac661a8-var-lock\") pod \"installer-5-master-0\" (UID: \"a0b70764-d088-42e7-8480-e7bbdac661a8\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 20:10:28.922774 master-0 kubenswrapper[29252]: I1203 20:10:28.922767 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0b70764-d088-42e7-8480-e7bbdac661a8-kube-api-access\") pod \"installer-5-master-0\" (UID: \"a0b70764-d088-42e7-8480-e7bbdac661a8\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 20:10:28.923149 master-0 kubenswrapper[29252]: I1203 20:10:28.922857 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a0b70764-d088-42e7-8480-e7bbdac661a8-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"a0b70764-d088-42e7-8480-e7bbdac661a8\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 20:10:29.016215 master-0 kubenswrapper[29252]: I1203 20:10:29.016088 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Dec 03 20:10:29.025094 master-0 kubenswrapper[29252]: I1203 20:10:29.025012 29252 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a0b70764-d088-42e7-8480-e7bbdac661a8-var-lock\") pod \"installer-5-master-0\" (UID: \"a0b70764-d088-42e7-8480-e7bbdac661a8\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 20:10:29.025328 master-0 kubenswrapper[29252]: I1203 20:10:29.025130 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0b70764-d088-42e7-8480-e7bbdac661a8-kube-api-access\") pod \"installer-5-master-0\" (UID: \"a0b70764-d088-42e7-8480-e7bbdac661a8\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 20:10:29.025328 master-0 kubenswrapper[29252]: I1203 20:10:29.025223 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a0b70764-d088-42e7-8480-e7bbdac661a8-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"a0b70764-d088-42e7-8480-e7bbdac661a8\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 20:10:29.025506 master-0 kubenswrapper[29252]: I1203 20:10:29.025397 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a0b70764-d088-42e7-8480-e7bbdac661a8-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"a0b70764-d088-42e7-8480-e7bbdac661a8\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 20:10:29.025506 master-0 kubenswrapper[29252]: I1203 20:10:29.025489 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a0b70764-d088-42e7-8480-e7bbdac661a8-var-lock\") pod \"installer-5-master-0\" (UID: \"a0b70764-d088-42e7-8480-e7bbdac661a8\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 20:10:29.060421 master-0 kubenswrapper[29252]: I1203 20:10:29.060229 29252 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0b70764-d088-42e7-8480-e7bbdac661a8-kube-api-access\") pod \"installer-5-master-0\" (UID: \"a0b70764-d088-42e7-8480-e7bbdac661a8\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 20:10:29.111333 master-0 kubenswrapper[29252]: I1203 20:10:29.111199 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"07bd5fbb-a709-44d4-a6a7-b09c89338bd6","Type":"ContainerStarted","Data":"f228c28da9f27c23c7c1448ceb7fc6840524250a71bc1e89773cf47e5155f8d1"} Dec 03 20:10:29.123531 master-0 kubenswrapper[29252]: I1203 20:10:29.123456 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 20:10:29.371264 master-0 kubenswrapper[29252]: I1203 20:10:29.371205 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-2-master-0" podStartSLOduration=2.371187336 podStartE2EDuration="2.371187336s" podCreationTimestamp="2025-12-03 20:10:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:10:29.137009559 +0000 UTC m=+63.950554592" watchObservedRunningTime="2025-12-03 20:10:29.371187336 +0000 UTC m=+64.184732289" Dec 03 20:10:29.373415 master-0 kubenswrapper[29252]: I1203 20:10:29.373385 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Dec 03 20:10:29.381339 master-0 kubenswrapper[29252]: W1203 20:10:29.381260 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda0b70764_d088_42e7_8480_e7bbdac661a8.slice/crio-8f3bbc9828e2a219adad379b51c579857f6028b807a110e7e83684b95a849a4c WatchSource:0}: Error finding container 8f3bbc9828e2a219adad379b51c579857f6028b807a110e7e83684b95a849a4c: Status 
404 returned error can't find the container with id 8f3bbc9828e2a219adad379b51c579857f6028b807a110e7e83684b95a849a4c Dec 03 20:10:30.121564 master-0 kubenswrapper[29252]: I1203 20:10:30.121527 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"a0b70764-d088-42e7-8480-e7bbdac661a8","Type":"ContainerStarted","Data":"f1a19aa9dbc35ef2d180c0716648536cd35334706c401525c103c7c6dd360e46"} Dec 03 20:10:30.121564 master-0 kubenswrapper[29252]: I1203 20:10:30.121568 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"a0b70764-d088-42e7-8480-e7bbdac661a8","Type":"ContainerStarted","Data":"8f3bbc9828e2a219adad379b51c579857f6028b807a110e7e83684b95a849a4c"} Dec 03 20:10:30.130300 master-0 kubenswrapper[29252]: I1203 20:10:30.130113 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_dc347c4e75ec09c3a7fea6a3ba3ee63c/kube-controller-manager/0.log" Dec 03 20:10:30.130300 master-0 kubenswrapper[29252]: I1203 20:10:30.130174 29252 generic.go:334] "Generic (PLEG): container finished" podID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerID="e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599" exitCode=137 Dec 03 20:10:30.130300 master-0 kubenswrapper[29252]: I1203 20:10:30.130278 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"dc347c4e75ec09c3a7fea6a3ba3ee63c","Type":"ContainerDied","Data":"e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599"} Dec 03 20:10:30.140749 master-0 kubenswrapper[29252]: I1203 20:10:30.140530 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-5-master-0" podStartSLOduration=2.140510051 podStartE2EDuration="2.140510051s" podCreationTimestamp="2025-12-03 20:10:28 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:10:30.138130407 +0000 UTC m=+64.951675360" watchObservedRunningTime="2025-12-03 20:10:30.140510051 +0000 UTC m=+64.954055004" Dec 03 20:10:31.148597 master-0 kubenswrapper[29252]: I1203 20:10:31.148538 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_dc347c4e75ec09c3a7fea6a3ba3ee63c/kube-controller-manager/0.log" Dec 03 20:10:31.149678 master-0 kubenswrapper[29252]: I1203 20:10:31.149642 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"dc347c4e75ec09c3a7fea6a3ba3ee63c","Type":"ContainerStarted","Data":"1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972"} Dec 03 20:10:35.546038 master-0 kubenswrapper[29252]: I1203 20:10:35.545960 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Dec 03 20:10:35.547114 master-0 kubenswrapper[29252]: I1203 20:10:35.546771 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 20:10:35.549062 master-0 kubenswrapper[29252]: I1203 20:10:35.549002 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 03 20:10:35.549342 master-0 kubenswrapper[29252]: I1203 20:10:35.549293 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-nmjr4" Dec 03 20:10:35.612536 master-0 kubenswrapper[29252]: I1203 20:10:35.612472 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/592c876e-547f-4a90-b590-d0960feade3d-var-lock\") pod \"installer-3-master-0\" (UID: \"592c876e-547f-4a90-b590-d0960feade3d\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 20:10:35.612815 master-0 kubenswrapper[29252]: I1203 20:10:35.612575 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/592c876e-547f-4a90-b590-d0960feade3d-kube-api-access\") pod \"installer-3-master-0\" (UID: \"592c876e-547f-4a90-b590-d0960feade3d\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 20:10:35.612815 master-0 kubenswrapper[29252]: I1203 20:10:35.612633 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/592c876e-547f-4a90-b590-d0960feade3d-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"592c876e-547f-4a90-b590-d0960feade3d\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 20:10:35.698916 master-0 kubenswrapper[29252]: I1203 20:10:35.698870 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Dec 03 20:10:35.713819 master-0 
kubenswrapper[29252]: I1203 20:10:35.713661 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/592c876e-547f-4a90-b590-d0960feade3d-kube-api-access\") pod \"installer-3-master-0\" (UID: \"592c876e-547f-4a90-b590-d0960feade3d\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 20:10:35.713819 master-0 kubenswrapper[29252]: I1203 20:10:35.713725 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/592c876e-547f-4a90-b590-d0960feade3d-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"592c876e-547f-4a90-b590-d0960feade3d\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 20:10:35.713819 master-0 kubenswrapper[29252]: I1203 20:10:35.713795 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/592c876e-547f-4a90-b590-d0960feade3d-var-lock\") pod \"installer-3-master-0\" (UID: \"592c876e-547f-4a90-b590-d0960feade3d\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 20:10:35.714159 master-0 kubenswrapper[29252]: I1203 20:10:35.713889 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/592c876e-547f-4a90-b590-d0960feade3d-var-lock\") pod \"installer-3-master-0\" (UID: \"592c876e-547f-4a90-b590-d0960feade3d\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 20:10:35.714159 master-0 kubenswrapper[29252]: I1203 20:10:35.713892 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/592c876e-547f-4a90-b590-d0960feade3d-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"592c876e-547f-4a90-b590-d0960feade3d\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 20:10:35.786629 
master-0 kubenswrapper[29252]: I1203 20:10:35.786523 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/592c876e-547f-4a90-b590-d0960feade3d-kube-api-access\") pod \"installer-3-master-0\" (UID: \"592c876e-547f-4a90-b590-d0960feade3d\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 20:10:35.886574 master-0 kubenswrapper[29252]: I1203 20:10:35.886502 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 20:10:36.318578 master-0 kubenswrapper[29252]: I1203 20:10:36.318207 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Dec 03 20:10:37.194037 master-0 kubenswrapper[29252]: I1203 20:10:37.193963 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"592c876e-547f-4a90-b590-d0960feade3d","Type":"ContainerStarted","Data":"4a7fa4d96701d6527679cbd2ed599aaa0d5dd8b3d6f7e93c9b245d86685ac09a"} Dec 03 20:10:37.194037 master-0 kubenswrapper[29252]: I1203 20:10:37.194034 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"592c876e-547f-4a90-b590-d0960feade3d","Type":"ContainerStarted","Data":"5e17743465d65eb57b525b718b3136ed898ce0a19e30a126fc3ba5273d4941c4"} Dec 03 20:10:37.213929 master-0 kubenswrapper[29252]: I1203 20:10:37.213829 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-3-master-0" podStartSLOduration=2.21380634 podStartE2EDuration="2.21380634s" podCreationTimestamp="2025-12-03 20:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:10:37.211509058 +0000 UTC m=+72.025054011" 
watchObservedRunningTime="2025-12-03 20:10:37.21380634 +0000 UTC m=+72.027351303" Dec 03 20:10:38.551973 master-0 kubenswrapper[29252]: I1203 20:10:38.551908 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:10:38.557306 master-0 kubenswrapper[29252]: I1203 20:10:38.555360 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 03 20:10:38.653318 master-0 kubenswrapper[29252]: I1203 20:10:38.653225 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access\") pod \"e73e6013-87fc-40e2-a573-39930828faa7\" (UID: \"e73e6013-87fc-40e2-a573-39930828faa7\") " Dec 03 20:10:38.657614 master-0 kubenswrapper[29252]: I1203 20:10:38.657532 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e73e6013-87fc-40e2-a573-39930828faa7" (UID: "e73e6013-87fc-40e2-a573-39930828faa7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:10:38.754707 master-0 kubenswrapper[29252]: I1203 20:10:38.754613 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e73e6013-87fc-40e2-a573-39930828faa7-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 20:10:39.398865 master-0 kubenswrapper[29252]: I1203 20:10:39.398746 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:10:39.406914 master-0 kubenswrapper[29252]: I1203 20:10:39.406861 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:10:39.765604 master-0 kubenswrapper[29252]: I1203 20:10:39.765442 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:10:40.224788 master-0 kubenswrapper[29252]: I1203 20:10:40.224718 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:10:42.163682 master-0 kubenswrapper[29252]: I1203 20:10:42.163635 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Dec 03 20:10:42.164750 master-0 kubenswrapper[29252]: I1203 20:10:42.164710 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-2-master-0" podUID="07bd5fbb-a709-44d4-a6a7-b09c89338bd6" containerName="installer" containerID="cri-o://f228c28da9f27c23c7c1448ceb7fc6840524250a71bc1e89773cf47e5155f8d1" gracePeriod=30 Dec 03 20:10:45.369554 master-0 kubenswrapper[29252]: I1203 20:10:45.369460 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Dec 03 
20:10:45.370604 master-0 kubenswrapper[29252]: I1203 20:10:45.370552 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 20:10:45.379188 master-0 kubenswrapper[29252]: I1203 20:10:45.379111 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Dec 03 20:10:45.440000 master-0 kubenswrapper[29252]: I1203 20:10:45.439918 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfb85302-c965-417f-8c35-9aff2e464281-kube-api-access\") pod \"installer-3-master-0\" (UID: \"bfb85302-c965-417f-8c35-9aff2e464281\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 20:10:45.440000 master-0 kubenswrapper[29252]: I1203 20:10:45.439990 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bfb85302-c965-417f-8c35-9aff2e464281-var-lock\") pod \"installer-3-master-0\" (UID: \"bfb85302-c965-417f-8c35-9aff2e464281\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 20:10:45.440000 master-0 kubenswrapper[29252]: I1203 20:10:45.440009 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bfb85302-c965-417f-8c35-9aff2e464281-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"bfb85302-c965-417f-8c35-9aff2e464281\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 20:10:45.541583 master-0 kubenswrapper[29252]: I1203 20:10:45.541507 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfb85302-c965-417f-8c35-9aff2e464281-kube-api-access\") pod \"installer-3-master-0\" (UID: \"bfb85302-c965-417f-8c35-9aff2e464281\") " 
pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 20:10:45.541851 master-0 kubenswrapper[29252]: I1203 20:10:45.541626 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bfb85302-c965-417f-8c35-9aff2e464281-var-lock\") pod \"installer-3-master-0\" (UID: \"bfb85302-c965-417f-8c35-9aff2e464281\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 20:10:45.541851 master-0 kubenswrapper[29252]: I1203 20:10:45.541666 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bfb85302-c965-417f-8c35-9aff2e464281-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"bfb85302-c965-417f-8c35-9aff2e464281\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 20:10:45.541851 master-0 kubenswrapper[29252]: I1203 20:10:45.541744 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bfb85302-c965-417f-8c35-9aff2e464281-var-lock\") pod \"installer-3-master-0\" (UID: \"bfb85302-c965-417f-8c35-9aff2e464281\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 20:10:45.542343 master-0 kubenswrapper[29252]: I1203 20:10:45.542055 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bfb85302-c965-417f-8c35-9aff2e464281-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"bfb85302-c965-417f-8c35-9aff2e464281\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 20:10:45.558216 master-0 kubenswrapper[29252]: I1203 20:10:45.558152 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfb85302-c965-417f-8c35-9aff2e464281-kube-api-access\") pod \"installer-3-master-0\" (UID: \"bfb85302-c965-417f-8c35-9aff2e464281\") " 
pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 20:10:45.686638 master-0 kubenswrapper[29252]: I1203 20:10:45.686467 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 20:10:46.009501 master-0 kubenswrapper[29252]: I1203 20:10:46.009330 29252 patch_prober.go:28] interesting pod/machine-config-daemon-7t8bs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 20:10:46.009501 master-0 kubenswrapper[29252]: I1203 20:10:46.009419 29252 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 20:10:46.009501 master-0 kubenswrapper[29252]: I1203 20:10:46.009484 29252 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" Dec 03 20:10:46.010276 master-0 kubenswrapper[29252]: I1203 20:10:46.010225 29252 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"07b6cf5187d73a5f60790f2be4d7efe702428be4f1b035394f75ae9cb9fd2d4a"} pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 20:10:46.010410 master-0 kubenswrapper[29252]: I1203 20:10:46.010325 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" 
containerName="machine-config-daemon" containerID="cri-o://07b6cf5187d73a5f60790f2be4d7efe702428be4f1b035394f75ae9cb9fd2d4a" gracePeriod=600
Dec 03 20:10:46.193893 master-0 kubenswrapper[29252]: I1203 20:10:46.193843 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"]
Dec 03 20:10:46.203607 master-0 kubenswrapper[29252]: W1203 20:10:46.203346 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbfb85302_c965_417f_8c35_9aff2e464281.slice/crio-a835a03167da55e2d0c5aa4c861edfc59d0f4ed81b8a01a8e922e73089c258b5 WatchSource:0}: Error finding container a835a03167da55e2d0c5aa4c861edfc59d0f4ed81b8a01a8e922e73089c258b5: Status 404 returned error can't find the container with id a835a03167da55e2d0c5aa4c861edfc59d0f4ed81b8a01a8e922e73089c258b5
Dec 03 20:10:46.265215 master-0 kubenswrapper[29252]: I1203 20:10:46.265118 29252 generic.go:334] "Generic (PLEG): container finished" podID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerID="07b6cf5187d73a5f60790f2be4d7efe702428be4f1b035394f75ae9cb9fd2d4a" exitCode=0
Dec 03 20:10:46.265324 master-0 kubenswrapper[29252]: I1203 20:10:46.265197 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" event={"ID":"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b","Type":"ContainerDied","Data":"07b6cf5187d73a5f60790f2be4d7efe702428be4f1b035394f75ae9cb9fd2d4a"}
Dec 03 20:10:46.265324 master-0 kubenswrapper[29252]: I1203 20:10:46.265304 29252 scope.go:117] "RemoveContainer" containerID="cf0ee7669690c522329aaa6f304ff26947b64db398890f4e63ca209b2410a161"
Dec 03 20:10:46.266704 master-0 kubenswrapper[29252]: I1203 20:10:46.266671 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"bfb85302-c965-417f-8c35-9aff2e464281","Type":"ContainerStarted","Data":"a835a03167da55e2d0c5aa4c861edfc59d0f4ed81b8a01a8e922e73089c258b5"}
Dec 03 20:10:47.289910 master-0 kubenswrapper[29252]: I1203 20:10:47.289851 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"bfb85302-c965-417f-8c35-9aff2e464281","Type":"ContainerStarted","Data":"9e2571a613c1c3cac0791e5a3b10370d1d77ae345a0fbb29706cdeb1555a8b96"}
Dec 03 20:10:47.294907 master-0 kubenswrapper[29252]: I1203 20:10:47.293982 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" event={"ID":"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b","Type":"ContainerStarted","Data":"4056ba5f9d6e5d99d04379037a16b03e163ddc085064bce80ce77bdbefd30aec"}
Dec 03 20:10:47.326055 master-0 kubenswrapper[29252]: I1203 20:10:47.325985 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-3-master-0" podStartSLOduration=2.3259660970000002 podStartE2EDuration="2.325966097s" podCreationTimestamp="2025-12-03 20:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:10:47.325455365 +0000 UTC m=+82.139000358" watchObservedRunningTime="2025-12-03 20:10:47.325966097 +0000 UTC m=+82.139511050"
Dec 03 20:10:59.384467 master-0 kubenswrapper[29252]: I1203 20:10:59.384394 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_07bd5fbb-a709-44d4-a6a7-b09c89338bd6/installer/0.log"
Dec 03 20:10:59.384467 master-0 kubenswrapper[29252]: I1203 20:10:59.384447 29252 generic.go:334] "Generic (PLEG): container finished" podID="07bd5fbb-a709-44d4-a6a7-b09c89338bd6" containerID="f228c28da9f27c23c7c1448ceb7fc6840524250a71bc1e89773cf47e5155f8d1" exitCode=1
Dec 03 20:10:59.384467 master-0 kubenswrapper[29252]: I1203 20:10:59.384475 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"07bd5fbb-a709-44d4-a6a7-b09c89338bd6","Type":"ContainerDied","Data":"f228c28da9f27c23c7c1448ceb7fc6840524250a71bc1e89773cf47e5155f8d1"}
Dec 03 20:10:59.705558 master-0 kubenswrapper[29252]: I1203 20:10:59.705528 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_07bd5fbb-a709-44d4-a6a7-b09c89338bd6/installer/0.log"
Dec 03 20:10:59.705848 master-0 kubenswrapper[29252]: I1203 20:10:59.705828 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0"
Dec 03 20:10:59.759876 master-0 kubenswrapper[29252]: I1203 20:10:59.758552 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07bd5fbb-a709-44d4-a6a7-b09c89338bd6-kubelet-dir\") pod \"07bd5fbb-a709-44d4-a6a7-b09c89338bd6\" (UID: \"07bd5fbb-a709-44d4-a6a7-b09c89338bd6\") "
Dec 03 20:10:59.759876 master-0 kubenswrapper[29252]: I1203 20:10:59.758946 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/07bd5fbb-a709-44d4-a6a7-b09c89338bd6-var-lock\") pod \"07bd5fbb-a709-44d4-a6a7-b09c89338bd6\" (UID: \"07bd5fbb-a709-44d4-a6a7-b09c89338bd6\") "
Dec 03 20:10:59.759876 master-0 kubenswrapper[29252]: I1203 20:10:59.759132 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07bd5fbb-a709-44d4-a6a7-b09c89338bd6-kube-api-access\") pod \"07bd5fbb-a709-44d4-a6a7-b09c89338bd6\" (UID: \"07bd5fbb-a709-44d4-a6a7-b09c89338bd6\") "
Dec 03 20:10:59.760551 master-0 kubenswrapper[29252]: I1203 20:10:59.760508 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07bd5fbb-a709-44d4-a6a7-b09c89338bd6-var-lock" (OuterVolumeSpecName: "var-lock") pod "07bd5fbb-a709-44d4-a6a7-b09c89338bd6" (UID: "07bd5fbb-a709-44d4-a6a7-b09c89338bd6"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:10:59.760736 master-0 kubenswrapper[29252]: I1203 20:10:59.760677 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07bd5fbb-a709-44d4-a6a7-b09c89338bd6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "07bd5fbb-a709-44d4-a6a7-b09c89338bd6" (UID: "07bd5fbb-a709-44d4-a6a7-b09c89338bd6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:10:59.764368 master-0 kubenswrapper[29252]: I1203 20:10:59.764277 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07bd5fbb-a709-44d4-a6a7-b09c89338bd6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "07bd5fbb-a709-44d4-a6a7-b09c89338bd6" (UID: "07bd5fbb-a709-44d4-a6a7-b09c89338bd6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 20:10:59.861258 master-0 kubenswrapper[29252]: I1203 20:10:59.861159 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07bd5fbb-a709-44d4-a6a7-b09c89338bd6-kube-api-access\") on node \"master-0\" DevicePath \"\""
Dec 03 20:10:59.861258 master-0 kubenswrapper[29252]: I1203 20:10:59.861216 29252 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07bd5fbb-a709-44d4-a6a7-b09c89338bd6-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Dec 03 20:10:59.861258 master-0 kubenswrapper[29252]: I1203 20:10:59.861235 29252 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/07bd5fbb-a709-44d4-a6a7-b09c89338bd6-var-lock\") on node \"master-0\" DevicePath \"\""
Dec 03 20:11:00.391237 master-0 kubenswrapper[29252]: I1203 20:11:00.391163 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_07bd5fbb-a709-44d4-a6a7-b09c89338bd6/installer/0.log"
Dec 03 20:11:00.391237 master-0 kubenswrapper[29252]: I1203 20:11:00.391234 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"07bd5fbb-a709-44d4-a6a7-b09c89338bd6","Type":"ContainerDied","Data":"f7112ad6a359b0ed68657eb14eb0487cdf0b5d7b63e1eb39a83e92e1bf8cc377"}
Dec 03 20:11:00.392184 master-0 kubenswrapper[29252]: I1203 20:11:00.391272 29252 scope.go:117] "RemoveContainer" containerID="f228c28da9f27c23c7c1448ceb7fc6840524250a71bc1e89773cf47e5155f8d1"
Dec 03 20:11:00.392184 master-0 kubenswrapper[29252]: I1203 20:11:00.391351 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0"
Dec 03 20:11:00.437728 master-0 kubenswrapper[29252]: I1203 20:11:00.437649 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Dec 03 20:11:00.454014 master-0 kubenswrapper[29252]: I1203 20:11:00.453966 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Dec 03 20:11:00.975744 master-0 kubenswrapper[29252]: I1203 20:11:00.975669 29252 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Dec 03 20:11:00.976370 master-0 kubenswrapper[29252]: I1203 20:11:00.976270 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="c46583dca69d50bb12bc004d7ee3300f" containerName="kube-scheduler" containerID="cri-o://73fd77c7f3160f50b85cebcaf7773a33c44b0958115b084cb590bef38d48ba5c" gracePeriod=30
Dec 03 20:11:00.976370 master-0 kubenswrapper[29252]: I1203 20:11:00.976344 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="c46583dca69d50bb12bc004d7ee3300f" containerName="kube-scheduler-recovery-controller" containerID="cri-o://a72510073f92e9ff068e8652b1a65285f64ee333e40d80be23e60bf13a3ce72d" gracePeriod=30
Dec 03 20:11:00.976551 master-0 kubenswrapper[29252]: I1203 20:11:00.976328 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="c46583dca69d50bb12bc004d7ee3300f" containerName="kube-scheduler-cert-syncer" containerID="cri-o://88a426b4c066f4efd6c67dba2d50d1674139b8757075139f8541302d74a32ce6" gracePeriod=30
Dec 03 20:11:00.977277 master-0 kubenswrapper[29252]: I1203 20:11:00.977228 29252 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Dec 03 20:11:00.977588 master-0 kubenswrapper[29252]: E1203 20:11:00.977551 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46583dca69d50bb12bc004d7ee3300f" containerName="kube-scheduler-recovery-controller"
Dec 03 20:11:00.977588 master-0 kubenswrapper[29252]: I1203 20:11:00.977582 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46583dca69d50bb12bc004d7ee3300f" containerName="kube-scheduler-recovery-controller"
Dec 03 20:11:00.977743 master-0 kubenswrapper[29252]: E1203 20:11:00.977630 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46583dca69d50bb12bc004d7ee3300f" containerName="kube-scheduler"
Dec 03 20:11:00.977743 master-0 kubenswrapper[29252]: I1203 20:11:00.977645 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46583dca69d50bb12bc004d7ee3300f" containerName="kube-scheduler"
Dec 03 20:11:00.977743 master-0 kubenswrapper[29252]: E1203 20:11:00.977664 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46583dca69d50bb12bc004d7ee3300f" containerName="kube-scheduler-cert-syncer"
Dec 03 20:11:00.977743 master-0 kubenswrapper[29252]: I1203 20:11:00.977677 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46583dca69d50bb12bc004d7ee3300f" containerName="kube-scheduler-cert-syncer"
Dec 03 20:11:00.977743 master-0 kubenswrapper[29252]: E1203 20:11:00.977695 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07bd5fbb-a709-44d4-a6a7-b09c89338bd6" containerName="installer"
Dec 03 20:11:00.977743 master-0 kubenswrapper[29252]: I1203 20:11:00.977706 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="07bd5fbb-a709-44d4-a6a7-b09c89338bd6" containerName="installer"
Dec 03 20:11:00.977743 master-0 kubenswrapper[29252]: E1203 20:11:00.977724 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46583dca69d50bb12bc004d7ee3300f" containerName="wait-for-host-port"
Dec 03 20:11:00.977743 master-0 kubenswrapper[29252]: I1203 20:11:00.977735 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46583dca69d50bb12bc004d7ee3300f" containerName="wait-for-host-port"
Dec 03 20:11:00.978199 master-0 kubenswrapper[29252]: I1203 20:11:00.977935 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="c46583dca69d50bb12bc004d7ee3300f" containerName="wait-for-host-port"
Dec 03 20:11:00.978199 master-0 kubenswrapper[29252]: I1203 20:11:00.977964 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="07bd5fbb-a709-44d4-a6a7-b09c89338bd6" containerName="installer"
Dec 03 20:11:00.978199 master-0 kubenswrapper[29252]: I1203 20:11:00.978000 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="c46583dca69d50bb12bc004d7ee3300f" containerName="kube-scheduler-recovery-controller"
Dec 03 20:11:00.978199 master-0 kubenswrapper[29252]: I1203 20:11:00.978019 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="c46583dca69d50bb12bc004d7ee3300f" containerName="kube-scheduler-cert-syncer"
Dec 03 20:11:00.978199 master-0 kubenswrapper[29252]: I1203 20:11:00.978034 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="c46583dca69d50bb12bc004d7ee3300f" containerName="kube-scheduler"
Dec 03 20:11:00.978199 master-0 kubenswrapper[29252]: I1203 20:11:00.978051 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="c46583dca69d50bb12bc004d7ee3300f" containerName="kube-scheduler-cert-syncer"
Dec 03 20:11:00.978517 master-0 kubenswrapper[29252]: E1203 20:11:00.978225 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46583dca69d50bb12bc004d7ee3300f" containerName="kube-scheduler-cert-syncer"
Dec 03 20:11:00.978517 master-0 kubenswrapper[29252]: I1203 20:11:00.978241 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46583dca69d50bb12bc004d7ee3300f" containerName="kube-scheduler-cert-syncer"
Dec 03 20:11:01.081076 master-0 kubenswrapper[29252]: I1203 20:11:01.080948 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/fd2fa610bb2a39c39fcdd00db03a511a-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fd2fa610bb2a39c39fcdd00db03a511a\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Dec 03 20:11:01.081491 master-0 kubenswrapper[29252]: I1203 20:11:01.081164 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/fd2fa610bb2a39c39fcdd00db03a511a-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fd2fa610bb2a39c39fcdd00db03a511a\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Dec 03 20:11:01.182672 master-0 kubenswrapper[29252]: I1203 20:11:01.182615 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/fd2fa610bb2a39c39fcdd00db03a511a-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fd2fa610bb2a39c39fcdd00db03a511a\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Dec 03 20:11:01.182866 master-0 kubenswrapper[29252]: I1203 20:11:01.182704 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/fd2fa610bb2a39c39fcdd00db03a511a-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fd2fa610bb2a39c39fcdd00db03a511a\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Dec 03 20:11:01.182866 master-0 kubenswrapper[29252]: I1203 20:11:01.182809 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/fd2fa610bb2a39c39fcdd00db03a511a-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fd2fa610bb2a39c39fcdd00db03a511a\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Dec 03 20:11:01.182866 master-0 kubenswrapper[29252]: I1203 20:11:01.182833 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/fd2fa610bb2a39c39fcdd00db03a511a-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fd2fa610bb2a39c39fcdd00db03a511a\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Dec 03 20:11:01.232440 master-0 kubenswrapper[29252]: I1203 20:11:01.232325 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_c46583dca69d50bb12bc004d7ee3300f/kube-scheduler-cert-syncer/1.log"
Dec 03 20:11:01.233837 master-0 kubenswrapper[29252]: I1203 20:11:01.233774 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_c46583dca69d50bb12bc004d7ee3300f/kube-scheduler-cert-syncer/0.log"
Dec 03 20:11:01.235231 master-0 kubenswrapper[29252]: I1203 20:11:01.235172 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Dec 03 20:11:01.240463 master-0 kubenswrapper[29252]: I1203 20:11:01.240361 29252 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="c46583dca69d50bb12bc004d7ee3300f" podUID="fd2fa610bb2a39c39fcdd00db03a511a"
Dec 03 20:11:01.385520 master-0 kubenswrapper[29252]: I1203 20:11:01.385128 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c46583dca69d50bb12bc004d7ee3300f-resource-dir\") pod \"c46583dca69d50bb12bc004d7ee3300f\" (UID: \"c46583dca69d50bb12bc004d7ee3300f\") "
Dec 03 20:11:01.385520 master-0 kubenswrapper[29252]: I1203 20:11:01.385226 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c46583dca69d50bb12bc004d7ee3300f-cert-dir\") pod \"c46583dca69d50bb12bc004d7ee3300f\" (UID: \"c46583dca69d50bb12bc004d7ee3300f\") "
Dec 03 20:11:01.385520 master-0 kubenswrapper[29252]: I1203 20:11:01.385472 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c46583dca69d50bb12bc004d7ee3300f-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "c46583dca69d50bb12bc004d7ee3300f" (UID: "c46583dca69d50bb12bc004d7ee3300f"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:11:01.386091 master-0 kubenswrapper[29252]: I1203 20:11:01.385607 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c46583dca69d50bb12bc004d7ee3300f-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "c46583dca69d50bb12bc004d7ee3300f" (UID: "c46583dca69d50bb12bc004d7ee3300f"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:11:01.386091 master-0 kubenswrapper[29252]: I1203 20:11:01.385914 29252 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c46583dca69d50bb12bc004d7ee3300f-resource-dir\") on node \"master-0\" DevicePath \"\""
Dec 03 20:11:01.386091 master-0 kubenswrapper[29252]: I1203 20:11:01.385950 29252 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c46583dca69d50bb12bc004d7ee3300f-cert-dir\") on node \"master-0\" DevicePath \"\""
Dec 03 20:11:01.406444 master-0 kubenswrapper[29252]: I1203 20:11:01.406368 29252 generic.go:334] "Generic (PLEG): container finished" podID="a0b70764-d088-42e7-8480-e7bbdac661a8" containerID="f1a19aa9dbc35ef2d180c0716648536cd35334706c401525c103c7c6dd360e46" exitCode=0
Dec 03 20:11:01.406444 master-0 kubenswrapper[29252]: I1203 20:11:01.406417 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"a0b70764-d088-42e7-8480-e7bbdac661a8","Type":"ContainerDied","Data":"f1a19aa9dbc35ef2d180c0716648536cd35334706c401525c103c7c6dd360e46"}
Dec 03 20:11:01.411658 master-0 kubenswrapper[29252]: I1203 20:11:01.411600 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_c46583dca69d50bb12bc004d7ee3300f/kube-scheduler-cert-syncer/1.log"
Dec 03 20:11:01.413553 master-0 kubenswrapper[29252]: I1203 20:11:01.413501 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_c46583dca69d50bb12bc004d7ee3300f/kube-scheduler-cert-syncer/0.log"
Dec 03 20:11:01.415045 master-0 kubenswrapper[29252]: I1203 20:11:01.415000 29252 generic.go:334] "Generic (PLEG): container finished" podID="c46583dca69d50bb12bc004d7ee3300f" containerID="88a426b4c066f4efd6c67dba2d50d1674139b8757075139f8541302d74a32ce6" exitCode=2
Dec 03 20:11:01.415474 master-0 kubenswrapper[29252]: I1203 20:11:01.415033 29252 generic.go:334] "Generic (PLEG): container finished" podID="c46583dca69d50bb12bc004d7ee3300f" containerID="a72510073f92e9ff068e8652b1a65285f64ee333e40d80be23e60bf13a3ce72d" exitCode=0
Dec 03 20:11:01.415474 master-0 kubenswrapper[29252]: I1203 20:11:01.415074 29252 generic.go:334] "Generic (PLEG): container finished" podID="c46583dca69d50bb12bc004d7ee3300f" containerID="73fd77c7f3160f50b85cebcaf7773a33c44b0958115b084cb590bef38d48ba5c" exitCode=0
Dec 03 20:11:01.415474 master-0 kubenswrapper[29252]: I1203 20:11:01.415146 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9703a47499dc38f6845f2d55184a1985a6a96f9f0e663c0707d6562d50b0c0c"
Dec 03 20:11:01.415474 master-0 kubenswrapper[29252]: I1203 20:11:01.415165 29252 scope.go:117] "RemoveContainer" containerID="89262883631fb1dcd59cc7a0a7e0379a0e77dd0b25dc2b21a16372a6fe8d007e"
Dec 03 20:11:01.415474 master-0 kubenswrapper[29252]: I1203 20:11:01.415167 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Dec 03 20:11:01.433627 master-0 kubenswrapper[29252]: I1203 20:11:01.433550 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07bd5fbb-a709-44d4-a6a7-b09c89338bd6" path="/var/lib/kubelet/pods/07bd5fbb-a709-44d4-a6a7-b09c89338bd6/volumes"
Dec 03 20:11:01.434398 master-0 kubenswrapper[29252]: I1203 20:11:01.434359 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c46583dca69d50bb12bc004d7ee3300f" path="/var/lib/kubelet/pods/c46583dca69d50bb12bc004d7ee3300f/volumes"
Dec 03 20:11:01.437244 master-0 kubenswrapper[29252]: I1203 20:11:01.437203 29252 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="c46583dca69d50bb12bc004d7ee3300f" podUID="fd2fa610bb2a39c39fcdd00db03a511a"
Dec 03 20:11:01.486013 master-0 kubenswrapper[29252]: I1203 20:11:01.485874 29252 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="c46583dca69d50bb12bc004d7ee3300f" podUID="fd2fa610bb2a39c39fcdd00db03a511a"
Dec 03 20:11:02.423449 master-0 kubenswrapper[29252]: I1203 20:11:02.423358 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_c46583dca69d50bb12bc004d7ee3300f/kube-scheduler-cert-syncer/1.log"
Dec 03 20:11:02.721232 master-0 kubenswrapper[29252]: I1203 20:11:02.721160 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Dec 03 20:11:02.808072 master-0 kubenswrapper[29252]: I1203 20:11:02.807996 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0b70764-d088-42e7-8480-e7bbdac661a8-kube-api-access\") pod \"a0b70764-d088-42e7-8480-e7bbdac661a8\" (UID: \"a0b70764-d088-42e7-8480-e7bbdac661a8\") "
Dec 03 20:11:02.808563 master-0 kubenswrapper[29252]: I1203 20:11:02.808540 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a0b70764-d088-42e7-8480-e7bbdac661a8-var-lock\") pod \"a0b70764-d088-42e7-8480-e7bbdac661a8\" (UID: \"a0b70764-d088-42e7-8480-e7bbdac661a8\") "
Dec 03 20:11:02.808735 master-0 kubenswrapper[29252]: I1203 20:11:02.808662 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0b70764-d088-42e7-8480-e7bbdac661a8-var-lock" (OuterVolumeSpecName: "var-lock") pod "a0b70764-d088-42e7-8480-e7bbdac661a8" (UID: "a0b70764-d088-42e7-8480-e7bbdac661a8"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:11:02.808897 master-0 kubenswrapper[29252]: I1203 20:11:02.808875 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a0b70764-d088-42e7-8480-e7bbdac661a8-kubelet-dir\") pod \"a0b70764-d088-42e7-8480-e7bbdac661a8\" (UID: \"a0b70764-d088-42e7-8480-e7bbdac661a8\") "
Dec 03 20:11:02.809117 master-0 kubenswrapper[29252]: I1203 20:11:02.808925 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0b70764-d088-42e7-8480-e7bbdac661a8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a0b70764-d088-42e7-8480-e7bbdac661a8" (UID: "a0b70764-d088-42e7-8480-e7bbdac661a8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:11:02.809438 master-0 kubenswrapper[29252]: I1203 20:11:02.809418 29252 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a0b70764-d088-42e7-8480-e7bbdac661a8-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Dec 03 20:11:02.809522 master-0 kubenswrapper[29252]: I1203 20:11:02.809509 29252 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a0b70764-d088-42e7-8480-e7bbdac661a8-var-lock\") on node \"master-0\" DevicePath \"\""
Dec 03 20:11:02.811794 master-0 kubenswrapper[29252]: I1203 20:11:02.811691 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0b70764-d088-42e7-8480-e7bbdac661a8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a0b70764-d088-42e7-8480-e7bbdac661a8" (UID: "a0b70764-d088-42e7-8480-e7bbdac661a8"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 20:11:02.911123 master-0 kubenswrapper[29252]: I1203 20:11:02.910954 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0b70764-d088-42e7-8480-e7bbdac661a8-kube-api-access\") on node \"master-0\" DevicePath \"\""
Dec 03 20:11:03.435156 master-0 kubenswrapper[29252]: I1203 20:11:03.435070 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"a0b70764-d088-42e7-8480-e7bbdac661a8","Type":"ContainerDied","Data":"8f3bbc9828e2a219adad379b51c579857f6028b807a110e7e83684b95a849a4c"}
Dec 03 20:11:03.435156 master-0 kubenswrapper[29252]: I1203 20:11:03.435129 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Dec 03 20:11:03.435996 master-0 kubenswrapper[29252]: I1203 20:11:03.435138 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f3bbc9828e2a219adad379b51c579857f6028b807a110e7e83684b95a849a4c"
Dec 03 20:11:09.393929 master-0 kubenswrapper[29252]: I1203 20:11:09.393836 29252 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Dec 03 20:11:09.394761 master-0 kubenswrapper[29252]: I1203 20:11:09.394193 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="cluster-policy-controller" containerID="cri-o://3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023" gracePeriod=30
Dec 03 20:11:09.394761 master-0 kubenswrapper[29252]: I1203 20:11:09.394396 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager" containerID="cri-o://1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972" gracePeriod=30
Dec 03 20:11:09.394761 master-0 kubenswrapper[29252]: I1203 20:11:09.394479 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2" gracePeriod=30
Dec 03 20:11:09.394761 master-0 kubenswrapper[29252]: I1203 20:11:09.394540 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9" gracePeriod=30
Dec 03 20:11:09.397014 master-0 kubenswrapper[29252]: I1203 20:11:09.396605 29252 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Dec 03 20:11:09.397014 master-0 kubenswrapper[29252]: E1203 20:11:09.396942 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager"
Dec 03 20:11:09.397014 master-0 kubenswrapper[29252]: I1203 20:11:09.396962 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager"
Dec 03 20:11:09.397014 master-0 kubenswrapper[29252]: E1203 20:11:09.396997 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="cluster-policy-controller"
Dec 03 20:11:09.397014 master-0 kubenswrapper[29252]: I1203 20:11:09.397011 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="cluster-policy-controller"
Dec 03 20:11:09.397412 master-0 kubenswrapper[29252]: E1203 20:11:09.397031 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b70764-d088-42e7-8480-e7bbdac661a8" containerName="installer"
Dec 03 20:11:09.397412 master-0 kubenswrapper[29252]: I1203 20:11:09.397083 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b70764-d088-42e7-8480-e7bbdac661a8" containerName="installer"
Dec 03 20:11:09.397412 master-0 kubenswrapper[29252]: E1203 20:11:09.397104 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager-recovery-controller"
Dec 03 20:11:09.397412 master-0 kubenswrapper[29252]: I1203 20:11:09.397117 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager-recovery-controller"
Dec 03 20:11:09.397412 master-0 kubenswrapper[29252]: E1203 20:11:09.397145 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager-cert-syncer"
Dec 03 20:11:09.397412 master-0 kubenswrapper[29252]: I1203 20:11:09.397160 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager-cert-syncer"
Dec 03 20:11:09.397412 master-0 kubenswrapper[29252]: I1203 20:11:09.397351 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager-recovery-controller"
Dec 03 20:11:09.397412 master-0 kubenswrapper[29252]: I1203 20:11:09.397370 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0b70764-d088-42e7-8480-e7bbdac661a8" containerName="installer"
Dec 03 20:11:09.397412 master-0 kubenswrapper[29252]: I1203 20:11:09.397389 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager-cert-syncer"
Dec 03 20:11:09.397412 master-0 kubenswrapper[29252]: I1203 20:11:09.397415 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager"
Dec 03 20:11:09.397878 master-0 kubenswrapper[29252]: I1203 20:11:09.397434 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager"
Dec 03 20:11:09.397878 master-0 kubenswrapper[29252]: I1203 20:11:09.397452 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="cluster-policy-controller"
Dec 03 20:11:09.397878 master-0 kubenswrapper[29252]: E1203 20:11:09.397642 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager"
Dec 03 20:11:09.397878 master-0 kubenswrapper[29252]: I1203 20:11:09.397658 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerName="kube-controller-manager"
Dec 03 20:11:09.512337 master-0 kubenswrapper[29252]: I1203 20:11:09.512275 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/6fb0810126310d28fb5532674012978b-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"6fb0810126310d28fb5532674012978b\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Dec 03 20:11:09.512536 master-0 kubenswrapper[29252]: I1203 20:11:09.512343 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/6fb0810126310d28fb5532674012978b-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"6fb0810126310d28fb5532674012978b\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Dec 03 20:11:09.613611 master-0 kubenswrapper[29252]: I1203 20:11:09.613516 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/6fb0810126310d28fb5532674012978b-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"6fb0810126310d28fb5532674012978b\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Dec 03 20:11:09.613832 master-0 kubenswrapper[29252]: I1203 20:11:09.613640 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/6fb0810126310d28fb5532674012978b-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"6fb0810126310d28fb5532674012978b\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Dec 03 20:11:09.613832 master-0 kubenswrapper[29252]: I1203 20:11:09.613760 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/6fb0810126310d28fb5532674012978b-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"6fb0810126310d28fb5532674012978b\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Dec 03 20:11:09.613942 master-0 kubenswrapper[29252]: I1203 20:11:09.613840 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/6fb0810126310d28fb5532674012978b-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"6fb0810126310d28fb5532674012978b\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Dec 03 20:11:09.697430 master-0 kubenswrapper[29252]: I1203 20:11:09.697278 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_dc347c4e75ec09c3a7fea6a3ba3ee63c/kube-controller-manager-cert-syncer/0.log"
Dec 03 20:11:09.698882 master-0 kubenswrapper[29252]: I1203 20:11:09.698839 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_dc347c4e75ec09c3a7fea6a3ba3ee63c/kube-controller-manager/0.log"
Dec 03 20:11:09.699041 master-0 kubenswrapper[29252]: I1203 20:11:09.698929 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Dec 03 20:11:09.703137 master-0 kubenswrapper[29252]: I1203 20:11:09.703049 29252 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" podUID="6fb0810126310d28fb5532674012978b"
Dec 03 20:11:09.816095 master-0 kubenswrapper[29252]: I1203 20:11:09.815984 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/dc347c4e75ec09c3a7fea6a3ba3ee63c-resource-dir\") pod \"dc347c4e75ec09c3a7fea6a3ba3ee63c\" (UID: \"dc347c4e75ec09c3a7fea6a3ba3ee63c\") "
Dec 03 20:11:09.816359 master-0 kubenswrapper[29252]: I1203 20:11:09.816135 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/dc347c4e75ec09c3a7fea6a3ba3ee63c-cert-dir\") pod \"dc347c4e75ec09c3a7fea6a3ba3ee63c\" (UID: \"dc347c4e75ec09c3a7fea6a3ba3ee63c\") "
Dec 03 20:11:09.816359 master-0 kubenswrapper[29252]: I1203 20:11:09.816125 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc347c4e75ec09c3a7fea6a3ba3ee63c-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "dc347c4e75ec09c3a7fea6a3ba3ee63c" (UID: "dc347c4e75ec09c3a7fea6a3ba3ee63c"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:11:09.816359 master-0 kubenswrapper[29252]: I1203 20:11:09.816246 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc347c4e75ec09c3a7fea6a3ba3ee63c-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "dc347c4e75ec09c3a7fea6a3ba3ee63c" (UID: "dc347c4e75ec09c3a7fea6a3ba3ee63c"). InnerVolumeSpecName "cert-dir".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:11:09.816562 master-0 kubenswrapper[29252]: I1203 20:11:09.816518 29252 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/dc347c4e75ec09c3a7fea6a3ba3ee63c-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:11:09.816562 master-0 kubenswrapper[29252]: I1203 20:11:09.816545 29252 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/dc347c4e75ec09c3a7fea6a3ba3ee63c-cert-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:11:10.486220 master-0 kubenswrapper[29252]: I1203 20:11:10.486074 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_dc347c4e75ec09c3a7fea6a3ba3ee63c/kube-controller-manager-cert-syncer/0.log" Dec 03 20:11:10.487963 master-0 kubenswrapper[29252]: I1203 20:11:10.487886 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_dc347c4e75ec09c3a7fea6a3ba3ee63c/kube-controller-manager/0.log" Dec 03 20:11:10.488113 master-0 kubenswrapper[29252]: I1203 20:11:10.487993 29252 generic.go:334] "Generic (PLEG): container finished" podID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerID="1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972" exitCode=0 Dec 03 20:11:10.488113 master-0 kubenswrapper[29252]: I1203 20:11:10.488027 29252 generic.go:334] "Generic (PLEG): container finished" podID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerID="eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2" exitCode=0 Dec 03 20:11:10.488113 master-0 kubenswrapper[29252]: I1203 20:11:10.488044 29252 generic.go:334] "Generic (PLEG): container finished" podID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerID="1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9" exitCode=2 Dec 03 20:11:10.488113 master-0 
kubenswrapper[29252]: I1203 20:11:10.488061 29252 generic.go:334] "Generic (PLEG): container finished" podID="dc347c4e75ec09c3a7fea6a3ba3ee63c" containerID="3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023" exitCode=0 Dec 03 20:11:10.488484 master-0 kubenswrapper[29252]: I1203 20:11:10.488167 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:11:10.488484 master-0 kubenswrapper[29252]: I1203 20:11:10.488198 29252 scope.go:117] "RemoveContainer" containerID="1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972" Dec 03 20:11:10.493081 master-0 kubenswrapper[29252]: I1203 20:11:10.493011 29252 generic.go:334] "Generic (PLEG): container finished" podID="592c876e-547f-4a90-b590-d0960feade3d" containerID="4a7fa4d96701d6527679cbd2ed599aaa0d5dd8b3d6f7e93c9b245d86685ac09a" exitCode=0 Dec 03 20:11:10.493233 master-0 kubenswrapper[29252]: I1203 20:11:10.493079 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"592c876e-547f-4a90-b590-d0960feade3d","Type":"ContainerDied","Data":"4a7fa4d96701d6527679cbd2ed599aaa0d5dd8b3d6f7e93c9b245d86685ac09a"} Dec 03 20:11:10.493233 master-0 kubenswrapper[29252]: I1203 20:11:10.493079 29252 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" podUID="6fb0810126310d28fb5532674012978b" Dec 03 20:11:10.515635 master-0 kubenswrapper[29252]: I1203 20:11:10.515526 29252 scope.go:117] "RemoveContainer" containerID="eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2" Dec 03 20:11:10.538138 master-0 kubenswrapper[29252]: I1203 20:11:10.538061 29252 scope.go:117] "RemoveContainer" containerID="1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9" Dec 03 
20:11:10.558062 master-0 kubenswrapper[29252]: I1203 20:11:10.557970 29252 scope.go:117] "RemoveContainer" containerID="3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023" Dec 03 20:11:10.579772 master-0 kubenswrapper[29252]: I1203 20:11:10.579712 29252 scope.go:117] "RemoveContainer" containerID="e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599" Dec 03 20:11:10.594931 master-0 kubenswrapper[29252]: I1203 20:11:10.594874 29252 scope.go:117] "RemoveContainer" containerID="1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972" Dec 03 20:11:10.595525 master-0 kubenswrapper[29252]: E1203 20:11:10.595460 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972\": container with ID starting with 1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972 not found: ID does not exist" containerID="1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972" Dec 03 20:11:10.595714 master-0 kubenswrapper[29252]: I1203 20:11:10.595539 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972"} err="failed to get container status \"1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972\": rpc error: code = NotFound desc = could not find container \"1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972\": container with ID starting with 1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972 not found: ID does not exist" Dec 03 20:11:10.595714 master-0 kubenswrapper[29252]: I1203 20:11:10.595578 29252 scope.go:117] "RemoveContainer" containerID="eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2" Dec 03 20:11:10.596642 master-0 kubenswrapper[29252]: E1203 20:11:10.596589 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2\": container with ID starting with eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2 not found: ID does not exist" containerID="eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2" Dec 03 20:11:10.596757 master-0 kubenswrapper[29252]: I1203 20:11:10.596643 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2"} err="failed to get container status \"eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2\": rpc error: code = NotFound desc = could not find container \"eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2\": container with ID starting with eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2 not found: ID does not exist" Dec 03 20:11:10.596757 master-0 kubenswrapper[29252]: I1203 20:11:10.596676 29252 scope.go:117] "RemoveContainer" containerID="1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9" Dec 03 20:11:10.597051 master-0 kubenswrapper[29252]: E1203 20:11:10.596986 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9\": container with ID starting with 1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9 not found: ID does not exist" containerID="1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9" Dec 03 20:11:10.597142 master-0 kubenswrapper[29252]: I1203 20:11:10.597032 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9"} err="failed to get container status \"1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9\": rpc error: code = NotFound desc = 
could not find container \"1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9\": container with ID starting with 1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9 not found: ID does not exist" Dec 03 20:11:10.597142 master-0 kubenswrapper[29252]: I1203 20:11:10.597068 29252 scope.go:117] "RemoveContainer" containerID="3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023" Dec 03 20:11:10.597394 master-0 kubenswrapper[29252]: E1203 20:11:10.597338 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023\": container with ID starting with 3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023 not found: ID does not exist" containerID="3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023" Dec 03 20:11:10.597475 master-0 kubenswrapper[29252]: I1203 20:11:10.597384 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023"} err="failed to get container status \"3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023\": rpc error: code = NotFound desc = could not find container \"3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023\": container with ID starting with 3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023 not found: ID does not exist" Dec 03 20:11:10.597475 master-0 kubenswrapper[29252]: I1203 20:11:10.597412 29252 scope.go:117] "RemoveContainer" containerID="e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599" Dec 03 20:11:10.597742 master-0 kubenswrapper[29252]: E1203 20:11:10.597695 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599\": container with ID starting with 
e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599 not found: ID does not exist" containerID="e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599" Dec 03 20:11:10.597861 master-0 kubenswrapper[29252]: I1203 20:11:10.597738 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599"} err="failed to get container status \"e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599\": rpc error: code = NotFound desc = could not find container \"e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599\": container with ID starting with e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599 not found: ID does not exist" Dec 03 20:11:10.597861 master-0 kubenswrapper[29252]: I1203 20:11:10.597768 29252 scope.go:117] "RemoveContainer" containerID="1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972" Dec 03 20:11:10.598179 master-0 kubenswrapper[29252]: I1203 20:11:10.598122 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972"} err="failed to get container status \"1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972\": rpc error: code = NotFound desc = could not find container \"1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972\": container with ID starting with 1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972 not found: ID does not exist" Dec 03 20:11:10.598179 master-0 kubenswrapper[29252]: I1203 20:11:10.598163 29252 scope.go:117] "RemoveContainer" containerID="eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2" Dec 03 20:11:10.598473 master-0 kubenswrapper[29252]: I1203 20:11:10.598425 29252 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2"} err="failed to get container status \"eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2\": rpc error: code = NotFound desc = could not find container \"eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2\": container with ID starting with eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2 not found: ID does not exist" Dec 03 20:11:10.598473 master-0 kubenswrapper[29252]: I1203 20:11:10.598466 29252 scope.go:117] "RemoveContainer" containerID="1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9" Dec 03 20:11:10.598757 master-0 kubenswrapper[29252]: I1203 20:11:10.598723 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9"} err="failed to get container status \"1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9\": rpc error: code = NotFound desc = could not find container \"1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9\": container with ID starting with 1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9 not found: ID does not exist" Dec 03 20:11:10.598757 master-0 kubenswrapper[29252]: I1203 20:11:10.598751 29252 scope.go:117] "RemoveContainer" containerID="3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023" Dec 03 20:11:10.599163 master-0 kubenswrapper[29252]: I1203 20:11:10.599104 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023"} err="failed to get container status \"3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023\": rpc error: code = NotFound desc = could not find container \"3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023\": container with ID starting with 
3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023 not found: ID does not exist" Dec 03 20:11:10.599163 master-0 kubenswrapper[29252]: I1203 20:11:10.599145 29252 scope.go:117] "RemoveContainer" containerID="e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599" Dec 03 20:11:10.599579 master-0 kubenswrapper[29252]: I1203 20:11:10.599526 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599"} err="failed to get container status \"e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599\": rpc error: code = NotFound desc = could not find container \"e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599\": container with ID starting with e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599 not found: ID does not exist" Dec 03 20:11:10.599579 master-0 kubenswrapper[29252]: I1203 20:11:10.599563 29252 scope.go:117] "RemoveContainer" containerID="1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972" Dec 03 20:11:10.599867 master-0 kubenswrapper[29252]: I1203 20:11:10.599814 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972"} err="failed to get container status \"1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972\": rpc error: code = NotFound desc = could not find container \"1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972\": container with ID starting with 1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972 not found: ID does not exist" Dec 03 20:11:10.599867 master-0 kubenswrapper[29252]: I1203 20:11:10.599857 29252 scope.go:117] "RemoveContainer" containerID="eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2" Dec 03 20:11:10.600247 master-0 kubenswrapper[29252]: I1203 20:11:10.600163 29252 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2"} err="failed to get container status \"eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2\": rpc error: code = NotFound desc = could not find container \"eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2\": container with ID starting with eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2 not found: ID does not exist" Dec 03 20:11:10.600395 master-0 kubenswrapper[29252]: I1203 20:11:10.600261 29252 scope.go:117] "RemoveContainer" containerID="1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9" Dec 03 20:11:10.600618 master-0 kubenswrapper[29252]: I1203 20:11:10.600562 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9"} err="failed to get container status \"1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9\": rpc error: code = NotFound desc = could not find container \"1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9\": container with ID starting with 1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9 not found: ID does not exist" Dec 03 20:11:10.600618 master-0 kubenswrapper[29252]: I1203 20:11:10.600601 29252 scope.go:117] "RemoveContainer" containerID="3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023" Dec 03 20:11:10.600931 master-0 kubenswrapper[29252]: I1203 20:11:10.600879 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023"} err="failed to get container status \"3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023\": rpc error: code = NotFound desc = could not find container \"3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023\": container with ID starting 
with 3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023 not found: ID does not exist" Dec 03 20:11:10.600931 master-0 kubenswrapper[29252]: I1203 20:11:10.600917 29252 scope.go:117] "RemoveContainer" containerID="e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599" Dec 03 20:11:10.601257 master-0 kubenswrapper[29252]: I1203 20:11:10.601220 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599"} err="failed to get container status \"e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599\": rpc error: code = NotFound desc = could not find container \"e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599\": container with ID starting with e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599 not found: ID does not exist" Dec 03 20:11:10.601257 master-0 kubenswrapper[29252]: I1203 20:11:10.601248 29252 scope.go:117] "RemoveContainer" containerID="1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972" Dec 03 20:11:10.601680 master-0 kubenswrapper[29252]: I1203 20:11:10.601630 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972"} err="failed to get container status \"1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972\": rpc error: code = NotFound desc = could not find container \"1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972\": container with ID starting with 1196c96bf635a9222b3ca8f01c50cd1086b94864486f9ceb7e2d7cb6713d2972 not found: ID does not exist" Dec 03 20:11:10.601680 master-0 kubenswrapper[29252]: I1203 20:11:10.601662 29252 scope.go:117] "RemoveContainer" containerID="eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2" Dec 03 20:11:10.601957 master-0 kubenswrapper[29252]: I1203 20:11:10.601905 29252 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2"} err="failed to get container status \"eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2\": rpc error: code = NotFound desc = could not find container \"eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2\": container with ID starting with eeed7e70243b0c7d0cdb518aca4933f934dc9449ecbd09fd90f63a63b5b04ff2 not found: ID does not exist" Dec 03 20:11:10.601957 master-0 kubenswrapper[29252]: I1203 20:11:10.601943 29252 scope.go:117] "RemoveContainer" containerID="1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9" Dec 03 20:11:10.602243 master-0 kubenswrapper[29252]: I1203 20:11:10.602191 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9"} err="failed to get container status \"1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9\": rpc error: code = NotFound desc = could not find container \"1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9\": container with ID starting with 1b26e52a787a146d8ac2caad4e6e6c9c8a519ca1b2be254ceb8dde0ac1c05fe9 not found: ID does not exist" Dec 03 20:11:10.602243 master-0 kubenswrapper[29252]: I1203 20:11:10.602227 29252 scope.go:117] "RemoveContainer" containerID="3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023" Dec 03 20:11:10.602492 master-0 kubenswrapper[29252]: I1203 20:11:10.602425 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023"} err="failed to get container status \"3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023\": rpc error: code = NotFound desc = could not find container \"3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023\": 
container with ID starting with 3b121807053834a5a86c6cec0bc4e9326bbd2098380a1a4d422ed38dc9c64023 not found: ID does not exist" Dec 03 20:11:10.602577 master-0 kubenswrapper[29252]: I1203 20:11:10.602533 29252 scope.go:117] "RemoveContainer" containerID="e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599" Dec 03 20:11:10.602824 master-0 kubenswrapper[29252]: I1203 20:11:10.602773 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599"} err="failed to get container status \"e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599\": rpc error: code = NotFound desc = could not find container \"e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599\": container with ID starting with e82988e6f715d7b5133ce50f6616c9b7f09245774e57057caa1701115d5ad599 not found: ID does not exist" Dec 03 20:11:10.710996 master-0 kubenswrapper[29252]: I1203 20:11:10.710917 29252 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" podUID="6fb0810126310d28fb5532674012978b" Dec 03 20:11:11.429755 master-0 kubenswrapper[29252]: I1203 20:11:11.429640 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc347c4e75ec09c3a7fea6a3ba3ee63c" path="/var/lib/kubelet/pods/dc347c4e75ec09c3a7fea6a3ba3ee63c/volumes" Dec 03 20:11:11.889246 master-0 kubenswrapper[29252]: I1203 20:11:11.889193 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 20:11:12.044676 master-0 kubenswrapper[29252]: I1203 20:11:12.044579 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/592c876e-547f-4a90-b590-d0960feade3d-kube-api-access\") pod \"592c876e-547f-4a90-b590-d0960feade3d\" (UID: \"592c876e-547f-4a90-b590-d0960feade3d\") " Dec 03 20:11:12.045041 master-0 kubenswrapper[29252]: I1203 20:11:12.044731 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/592c876e-547f-4a90-b590-d0960feade3d-var-lock\") pod \"592c876e-547f-4a90-b590-d0960feade3d\" (UID: \"592c876e-547f-4a90-b590-d0960feade3d\") " Dec 03 20:11:12.045041 master-0 kubenswrapper[29252]: I1203 20:11:12.044938 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/592c876e-547f-4a90-b590-d0960feade3d-kubelet-dir\") pod \"592c876e-547f-4a90-b590-d0960feade3d\" (UID: \"592c876e-547f-4a90-b590-d0960feade3d\") " Dec 03 20:11:12.045491 master-0 kubenswrapper[29252]: I1203 20:11:12.045421 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/592c876e-547f-4a90-b590-d0960feade3d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "592c876e-547f-4a90-b590-d0960feade3d" (UID: "592c876e-547f-4a90-b590-d0960feade3d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:11:12.046027 master-0 kubenswrapper[29252]: I1203 20:11:12.045934 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/592c876e-547f-4a90-b590-d0960feade3d-var-lock" (OuterVolumeSpecName: "var-lock") pod "592c876e-547f-4a90-b590-d0960feade3d" (UID: "592c876e-547f-4a90-b590-d0960feade3d"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:11:12.053322 master-0 kubenswrapper[29252]: I1203 20:11:12.053224 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/592c876e-547f-4a90-b590-d0960feade3d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "592c876e-547f-4a90-b590-d0960feade3d" (UID: "592c876e-547f-4a90-b590-d0960feade3d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:11:12.146518 master-0 kubenswrapper[29252]: I1203 20:11:12.146389 29252 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/592c876e-547f-4a90-b590-d0960feade3d-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:11:12.146518 master-0 kubenswrapper[29252]: I1203 20:11:12.146466 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/592c876e-547f-4a90-b590-d0960feade3d-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 20:11:12.146518 master-0 kubenswrapper[29252]: I1203 20:11:12.146483 29252 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/592c876e-547f-4a90-b590-d0960feade3d-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 20:11:12.511314 master-0 kubenswrapper[29252]: I1203 20:11:12.511071 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"592c876e-547f-4a90-b590-d0960feade3d","Type":"ContainerDied","Data":"5e17743465d65eb57b525b718b3136ed898ce0a19e30a126fc3ba5273d4941c4"} Dec 03 20:11:12.511314 master-0 kubenswrapper[29252]: I1203 20:11:12.511171 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 20:11:12.511314 master-0 kubenswrapper[29252]: I1203 20:11:12.511178 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e17743465d65eb57b525b718b3136ed898ce0a19e30a126fc3ba5273d4941c4" Dec 03 20:11:15.415690 master-0 kubenswrapper[29252]: I1203 20:11:15.415610 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 20:11:15.450296 master-0 kubenswrapper[29252]: I1203 20:11:15.450215 29252 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="69257fc9-fda4-4ec3-b449-d8f9e00b8dd5" Dec 03 20:11:15.450510 master-0 kubenswrapper[29252]: I1203 20:11:15.450307 29252 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="69257fc9-fda4-4ec3-b449-d8f9e00b8dd5" Dec 03 20:11:15.688561 master-0 kubenswrapper[29252]: I1203 20:11:15.688440 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 03 20:11:15.691229 master-0 kubenswrapper[29252]: I1203 20:11:15.691168 29252 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 20:11:15.696731 master-0 kubenswrapper[29252]: I1203 20:11:15.696672 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 03 20:11:15.706844 master-0 kubenswrapper[29252]: I1203 20:11:15.706758 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 20:11:15.709947 master-0 kubenswrapper[29252]: I1203 20:11:15.709903 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 03 20:11:15.739510 master-0 kubenswrapper[29252]: W1203 20:11:15.739437 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd2fa610bb2a39c39fcdd00db03a511a.slice/crio-dd95ea94b048b5219d2e4ac03c51060a3deac31d448e28f9f66783e346287744 WatchSource:0}: Error finding container dd95ea94b048b5219d2e4ac03c51060a3deac31d448e28f9f66783e346287744: Status 404 returned error can't find the container with id dd95ea94b048b5219d2e4ac03c51060a3deac31d448e28f9f66783e346287744 Dec 03 20:11:16.549827 master-0 kubenswrapper[29252]: I1203 20:11:16.549694 29252 generic.go:334] "Generic (PLEG): container finished" podID="fd2fa610bb2a39c39fcdd00db03a511a" containerID="bfa6034f46b6330228197803ed9bf11deb2df12974d65cc4a34352c7d319f496" exitCode=0 Dec 03 20:11:16.549827 master-0 kubenswrapper[29252]: I1203 20:11:16.549768 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"fd2fa610bb2a39c39fcdd00db03a511a","Type":"ContainerDied","Data":"bfa6034f46b6330228197803ed9bf11deb2df12974d65cc4a34352c7d319f496"} Dec 03 20:11:16.550845 master-0 kubenswrapper[29252]: I1203 20:11:16.549875 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"fd2fa610bb2a39c39fcdd00db03a511a","Type":"ContainerStarted","Data":"dd95ea94b048b5219d2e4ac03c51060a3deac31d448e28f9f66783e346287744"} Dec 03 20:11:17.559765 master-0 kubenswrapper[29252]: I1203 20:11:17.559661 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" 
event={"ID":"fd2fa610bb2a39c39fcdd00db03a511a","Type":"ContainerStarted","Data":"ee9ca5f9ed3910874de030494704af7f00443b37cd599db2ad909755586e1f38"} Dec 03 20:11:17.559765 master-0 kubenswrapper[29252]: I1203 20:11:17.559725 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"fd2fa610bb2a39c39fcdd00db03a511a","Type":"ContainerStarted","Data":"727635106f1176af01bdd48c052742d28e2198e518c32ece4f2e7bf6894c9048"} Dec 03 20:11:17.559765 master-0 kubenswrapper[29252]: I1203 20:11:17.559736 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"fd2fa610bb2a39c39fcdd00db03a511a","Type":"ContainerStarted","Data":"25b30aeeeea90016097fcbcf0481613ecc5289680afff73d19f13ec54b2657f8"} Dec 03 20:11:17.560551 master-0 kubenswrapper[29252]: I1203 20:11:17.559911 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 20:11:17.579938 master-0 kubenswrapper[29252]: I1203 20:11:17.579841 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=2.579826099 podStartE2EDuration="2.579826099s" podCreationTimestamp="2025-12-03 20:11:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:11:17.578061928 +0000 UTC m=+112.391606881" watchObservedRunningTime="2025-12-03 20:11:17.579826099 +0000 UTC m=+112.393371052" Dec 03 20:11:20.416642 master-0 kubenswrapper[29252]: I1203 20:11:20.416526 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:11:20.438716 master-0 kubenswrapper[29252]: I1203 20:11:20.438642 29252 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d6c90da7-0579-4af1-93f5-4385447e54ff" Dec 03 20:11:20.438716 master-0 kubenswrapper[29252]: I1203 20:11:20.438684 29252 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d6c90da7-0579-4af1-93f5-4385447e54ff" Dec 03 20:11:20.449377 master-0 kubenswrapper[29252]: I1203 20:11:20.449335 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 20:11:20.451712 master-0 kubenswrapper[29252]: I1203 20:11:20.451685 29252 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:11:20.454330 master-0 kubenswrapper[29252]: I1203 20:11:20.454291 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 20:11:20.464896 master-0 kubenswrapper[29252]: I1203 20:11:20.464838 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:11:20.469424 master-0 kubenswrapper[29252]: I1203 20:11:20.469333 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 20:11:20.491429 master-0 kubenswrapper[29252]: W1203 20:11:20.491359 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fb0810126310d28fb5532674012978b.slice/crio-a4f8eda8bc8a7d5d58c022c0f08cee0f9be1154c9ebe02d72dd494bee2d9d3f1 WatchSource:0}: Error finding container a4f8eda8bc8a7d5d58c022c0f08cee0f9be1154c9ebe02d72dd494bee2d9d3f1: Status 404 returned error can't find the container with id a4f8eda8bc8a7d5d58c022c0f08cee0f9be1154c9ebe02d72dd494bee2d9d3f1 Dec 03 20:11:20.583593 master-0 kubenswrapper[29252]: I1203 20:11:20.583410 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"6fb0810126310d28fb5532674012978b","Type":"ContainerStarted","Data":"a4f8eda8bc8a7d5d58c022c0f08cee0f9be1154c9ebe02d72dd494bee2d9d3f1"} Dec 03 20:11:21.597123 master-0 kubenswrapper[29252]: I1203 20:11:21.597053 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"6fb0810126310d28fb5532674012978b","Type":"ContainerStarted","Data":"47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb"} Dec 03 20:11:21.597838 master-0 kubenswrapper[29252]: I1203 20:11:21.597117 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"6fb0810126310d28fb5532674012978b","Type":"ContainerStarted","Data":"23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5"} Dec 03 20:11:21.597838 master-0 kubenswrapper[29252]: I1203 20:11:21.597292 29252 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"6fb0810126310d28fb5532674012978b","Type":"ContainerStarted","Data":"d2a9fff66cc8aec805af934297108d64fdaa0ffb64bc75c967dcb4742c7e5f5f"} Dec 03 20:11:22.609835 master-0 kubenswrapper[29252]: I1203 20:11:22.609718 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"6fb0810126310d28fb5532674012978b","Type":"ContainerStarted","Data":"dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea"} Dec 03 20:11:22.642214 master-0 kubenswrapper[29252]: I1203 20:11:22.642112 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.642087853 podStartE2EDuration="2.642087853s" podCreationTimestamp="2025-12-03 20:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:11:22.63884845 +0000 UTC m=+117.452393463" watchObservedRunningTime="2025-12-03 20:11:22.642087853 +0000 UTC m=+117.455632846" Dec 03 20:11:25.442936 master-0 kubenswrapper[29252]: I1203 20:11:25.442865 29252 scope.go:117] "RemoveContainer" containerID="73fd77c7f3160f50b85cebcaf7773a33c44b0958115b084cb590bef38d48ba5c" Dec 03 20:11:25.471072 master-0 kubenswrapper[29252]: I1203 20:11:25.471003 29252 scope.go:117] "RemoveContainer" containerID="a72510073f92e9ff068e8652b1a65285f64ee333e40d80be23e60bf13a3ce72d" Dec 03 20:11:25.494429 master-0 kubenswrapper[29252]: I1203 20:11:25.494331 29252 scope.go:117] "RemoveContainer" containerID="384902c9d5118b992b516df4665219d1bebf7324327cde78b939566df8720f4b" Dec 03 20:11:30.465708 master-0 kubenswrapper[29252]: I1203 20:11:30.465615 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:11:30.466913 master-0 kubenswrapper[29252]: I1203 20:11:30.466086 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:11:30.466913 master-0 kubenswrapper[29252]: I1203 20:11:30.466120 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:11:30.466913 master-0 kubenswrapper[29252]: I1203 20:11:30.466299 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:11:30.472516 master-0 kubenswrapper[29252]: I1203 20:11:30.472091 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:11:30.472516 master-0 kubenswrapper[29252]: I1203 20:11:30.472467 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:11:30.695244 master-0 kubenswrapper[29252]: I1203 20:11:30.695145 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:11:31.704608 master-0 kubenswrapper[29252]: I1203 20:11:31.704521 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:11:44.399198 master-0 kubenswrapper[29252]: I1203 20:11:44.399051 29252 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 03 20:11:44.399766 master-0 kubenswrapper[29252]: I1203 20:11:44.399396 29252 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver" containerID="cri-o://22463a8cc479cad0b18b7fc08c942143546ab5eb5ece91f4b438d1d815531859" gracePeriod=15 Dec 03 20:11:44.399766 master-0 kubenswrapper[29252]: I1203 20:11:44.399525 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver-cert-syncer" containerID="cri-o://009859dfb35cd527b6f73d6e859baa9e361df0c4b1b88a554c64f8f7fef46bf9" gracePeriod=15 Dec 03 20:11:44.399766 master-0 kubenswrapper[29252]: I1203 20:11:44.399492 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://eaee7d593be548d765bd8f2d4c68f44dae98c436387a0e4202467ebdd81d0080" gracePeriod=15 Dec 03 20:11:44.399766 master-0 kubenswrapper[29252]: I1203 20:11:44.399540 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver-check-endpoints" containerID="cri-o://76f9ca425f52f5ae839d8475d3bce62ca26fad716e336398fd44de61a9633f54" gracePeriod=15 Dec 03 20:11:44.399766 master-0 kubenswrapper[29252]: I1203 20:11:44.399710 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://906c3bca6e6dba68fc1f6de40f7c8c547d5581dd0b1d1d54e1eb0216877477d1" gracePeriod=15 Dec 03 20:11:44.408278 master-0 kubenswrapper[29252]: I1203 20:11:44.408202 29252 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 03 20:11:44.408592 master-0 kubenswrapper[29252]: E1203 20:11:44.408549 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver-insecure-readyz" Dec 03 20:11:44.408592 master-0 kubenswrapper[29252]: I1203 20:11:44.408578 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver-insecure-readyz" Dec 03 20:11:44.408700 master-0 kubenswrapper[29252]: E1203 20:11:44.408608 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa3433149c0833909dd6c97d45272ed" containerName="setup" Dec 03 20:11:44.408700 master-0 kubenswrapper[29252]: I1203 20:11:44.408618 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa3433149c0833909dd6c97d45272ed" containerName="setup" Dec 03 20:11:44.408700 master-0 kubenswrapper[29252]: E1203 20:11:44.408631 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 20:11:44.408700 master-0 kubenswrapper[29252]: I1203 20:11:44.408639 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 20:11:44.408700 master-0 kubenswrapper[29252]: E1203 20:11:44.408652 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver" Dec 03 20:11:44.408700 master-0 kubenswrapper[29252]: I1203 20:11:44.408659 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver" Dec 03 20:11:44.408700 master-0 kubenswrapper[29252]: E1203 20:11:44.408680 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="592c876e-547f-4a90-b590-d0960feade3d" 
containerName="installer" Dec 03 20:11:44.408700 master-0 kubenswrapper[29252]: I1203 20:11:44.408687 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="592c876e-547f-4a90-b590-d0960feade3d" containerName="installer" Dec 03 20:11:44.408700 master-0 kubenswrapper[29252]: E1203 20:11:44.408699 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver-check-endpoints" Dec 03 20:11:44.408700 master-0 kubenswrapper[29252]: I1203 20:11:44.408709 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver-check-endpoints" Dec 03 20:11:44.408700 master-0 kubenswrapper[29252]: E1203 20:11:44.408724 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver-check-endpoints" Dec 03 20:11:44.408700 master-0 kubenswrapper[29252]: I1203 20:11:44.408735 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver-check-endpoints" Dec 03 20:11:44.409191 master-0 kubenswrapper[29252]: E1203 20:11:44.408747 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver-cert-syncer" Dec 03 20:11:44.409191 master-0 kubenswrapper[29252]: I1203 20:11:44.408756 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver-cert-syncer" Dec 03 20:11:44.409191 master-0 kubenswrapper[29252]: I1203 20:11:44.408886 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 20:11:44.409191 master-0 kubenswrapper[29252]: I1203 20:11:44.408905 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="efa3433149c0833909dd6c97d45272ed" 
containerName="kube-apiserver-check-endpoints" Dec 03 20:11:44.409191 master-0 kubenswrapper[29252]: I1203 20:11:44.408918 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver" Dec 03 20:11:44.409191 master-0 kubenswrapper[29252]: I1203 20:11:44.408927 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver-cert-syncer" Dec 03 20:11:44.409191 master-0 kubenswrapper[29252]: I1203 20:11:44.408942 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="592c876e-547f-4a90-b590-d0960feade3d" containerName="installer" Dec 03 20:11:44.409191 master-0 kubenswrapper[29252]: I1203 20:11:44.408965 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver-check-endpoints" Dec 03 20:11:44.409191 master-0 kubenswrapper[29252]: I1203 20:11:44.408977 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="efa3433149c0833909dd6c97d45272ed" containerName="kube-apiserver-insecure-readyz" Dec 03 20:11:44.411445 master-0 kubenswrapper[29252]: I1203 20:11:44.411397 29252 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 03 20:11:44.412204 master-0 kubenswrapper[29252]: I1203 20:11:44.412168 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:11:44.429513 master-0 kubenswrapper[29252]: I1203 20:11:44.429453 29252 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="efa3433149c0833909dd6c97d45272ed" podUID="382c2026eb84cf3d7672e1fe1646be64" Dec 03 20:11:44.494479 master-0 kubenswrapper[29252]: I1203 20:11:44.494396 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"cbad610cd8689b0972c02840bb486a62\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:11:44.494479 master-0 kubenswrapper[29252]: I1203 20:11:44.494457 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/382c2026eb84cf3d7672e1fe1646be64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"382c2026eb84cf3d7672e1fe1646be64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:11:44.494479 master-0 kubenswrapper[29252]: I1203 20:11:44.494492 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/382c2026eb84cf3d7672e1fe1646be64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"382c2026eb84cf3d7672e1fe1646be64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:11:44.494815 master-0 kubenswrapper[29252]: I1203 20:11:44.494541 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" 
(UID: \"cbad610cd8689b0972c02840bb486a62\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:11:44.494815 master-0 kubenswrapper[29252]: I1203 20:11:44.494558 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/382c2026eb84cf3d7672e1fe1646be64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"382c2026eb84cf3d7672e1fe1646be64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:11:44.494815 master-0 kubenswrapper[29252]: I1203 20:11:44.494573 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"cbad610cd8689b0972c02840bb486a62\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:11:44.494815 master-0 kubenswrapper[29252]: I1203 20:11:44.494592 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"cbad610cd8689b0972c02840bb486a62\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:11:44.494815 master-0 kubenswrapper[29252]: I1203 20:11:44.494619 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"cbad610cd8689b0972c02840bb486a62\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:11:44.596424 master-0 kubenswrapper[29252]: I1203 20:11:44.596269 29252 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/382c2026eb84cf3d7672e1fe1646be64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"382c2026eb84cf3d7672e1fe1646be64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:11:44.596424 master-0 kubenswrapper[29252]: I1203 20:11:44.596388 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/382c2026eb84cf3d7672e1fe1646be64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"382c2026eb84cf3d7672e1fe1646be64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:11:44.596424 master-0 kubenswrapper[29252]: I1203 20:11:44.596402 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/382c2026eb84cf3d7672e1fe1646be64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"382c2026eb84cf3d7672e1fe1646be64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:11:44.596951 master-0 kubenswrapper[29252]: I1203 20:11:44.596459 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/382c2026eb84cf3d7672e1fe1646be64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"382c2026eb84cf3d7672e1fe1646be64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:11:44.596951 master-0 kubenswrapper[29252]: I1203 20:11:44.596490 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"cbad610cd8689b0972c02840bb486a62\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:11:44.596951 master-0 kubenswrapper[29252]: I1203 20:11:44.596529 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/382c2026eb84cf3d7672e1fe1646be64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"382c2026eb84cf3d7672e1fe1646be64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:11:44.596951 master-0 kubenswrapper[29252]: I1203 20:11:44.596584 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"cbad610cd8689b0972c02840bb486a62\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:11:44.596951 master-0 kubenswrapper[29252]: I1203 20:11:44.596601 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"cbad610cd8689b0972c02840bb486a62\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:11:44.596951 master-0 kubenswrapper[29252]: I1203 20:11:44.596642 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"cbad610cd8689b0972c02840bb486a62\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:11:44.596951 master-0 kubenswrapper[29252]: I1203 20:11:44.596680 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/382c2026eb84cf3d7672e1fe1646be64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"382c2026eb84cf3d7672e1fe1646be64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:11:44.596951 master-0 kubenswrapper[29252]: I1203 20:11:44.596714 29252 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"cbad610cd8689b0972c02840bb486a62\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:11:44.596951 master-0 kubenswrapper[29252]: I1203 20:11:44.596755 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"cbad610cd8689b0972c02840bb486a62\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:11:44.596951 master-0 kubenswrapper[29252]: I1203 20:11:44.596760 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"cbad610cd8689b0972c02840bb486a62\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:11:44.596951 master-0 kubenswrapper[29252]: I1203 20:11:44.596702 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"cbad610cd8689b0972c02840bb486a62\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:11:44.596951 master-0 kubenswrapper[29252]: I1203 20:11:44.596890 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"cbad610cd8689b0972c02840bb486a62\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 
20:11:44.597668 master-0 kubenswrapper[29252]: I1203 20:11:44.597012 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"cbad610cd8689b0972c02840bb486a62\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:11:44.801208 master-0 kubenswrapper[29252]: I1203 20:11:44.801088 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_efa3433149c0833909dd6c97d45272ed/kube-apiserver-check-endpoints/1.log" Dec 03 20:11:44.802571 master-0 kubenswrapper[29252]: I1203 20:11:44.802540 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_efa3433149c0833909dd6c97d45272ed/kube-apiserver-cert-syncer/0.log" Dec 03 20:11:44.803552 master-0 kubenswrapper[29252]: I1203 20:11:44.803496 29252 generic.go:334] "Generic (PLEG): container finished" podID="efa3433149c0833909dd6c97d45272ed" containerID="76f9ca425f52f5ae839d8475d3bce62ca26fad716e336398fd44de61a9633f54" exitCode=0 Dec 03 20:11:44.803552 master-0 kubenswrapper[29252]: I1203 20:11:44.803533 29252 generic.go:334] "Generic (PLEG): container finished" podID="efa3433149c0833909dd6c97d45272ed" containerID="eaee7d593be548d765bd8f2d4c68f44dae98c436387a0e4202467ebdd81d0080" exitCode=0 Dec 03 20:11:44.803552 master-0 kubenswrapper[29252]: I1203 20:11:44.803543 29252 generic.go:334] "Generic (PLEG): container finished" podID="efa3433149c0833909dd6c97d45272ed" containerID="906c3bca6e6dba68fc1f6de40f7c8c547d5581dd0b1d1d54e1eb0216877477d1" exitCode=0 Dec 03 20:11:44.803552 master-0 kubenswrapper[29252]: I1203 20:11:44.803551 29252 generic.go:334] "Generic (PLEG): container finished" podID="efa3433149c0833909dd6c97d45272ed" containerID="009859dfb35cd527b6f73d6e859baa9e361df0c4b1b88a554c64f8f7fef46bf9" exitCode=2 Dec 
03 20:11:44.803842 master-0 kubenswrapper[29252]: I1203 20:11:44.803579 29252 scope.go:117] "RemoveContainer" containerID="ee2aaab9b8550f344df3e7445ae5d2dcb743224979d469841109025bb15970fd" Dec 03 20:11:44.806584 master-0 kubenswrapper[29252]: I1203 20:11:44.806519 29252 generic.go:334] "Generic (PLEG): container finished" podID="bfb85302-c965-417f-8c35-9aff2e464281" containerID="9e2571a613c1c3cac0791e5a3b10370d1d77ae345a0fbb29706cdeb1555a8b96" exitCode=0 Dec 03 20:11:44.806584 master-0 kubenswrapper[29252]: I1203 20:11:44.806571 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"bfb85302-c965-417f-8c35-9aff2e464281","Type":"ContainerDied","Data":"9e2571a613c1c3cac0791e5a3b10370d1d77ae345a0fbb29706cdeb1555a8b96"} Dec 03 20:11:44.807865 master-0 kubenswrapper[29252]: I1203 20:11:44.807808 29252 status_manager.go:851] "Failed to get status for pod" podUID="bfb85302-c965-417f-8c35-9aff2e464281" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:11:44.855206 master-0 kubenswrapper[29252]: E1203 20:11:44.855131 29252 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:11:44.856062 master-0 kubenswrapper[29252]: E1203 20:11:44.856000 29252 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:11:44.856977 master-0 kubenswrapper[29252]: E1203 20:11:44.856818 29252 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:11:44.857829 master-0 kubenswrapper[29252]: E1203 20:11:44.857722 29252 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:11:44.859068 master-0 kubenswrapper[29252]: E1203 20:11:44.858860 29252 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:11:44.859068 master-0 kubenswrapper[29252]: I1203 20:11:44.858927 29252 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 03 20:11:44.859987 master-0 kubenswrapper[29252]: E1203 20:11:44.859903 29252 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Dec 03 20:11:45.061242 master-0 kubenswrapper[29252]: E1203 20:11:45.061044 29252 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Dec 03 20:11:45.422742 master-0 kubenswrapper[29252]: I1203 20:11:45.422665 29252 status_manager.go:851] "Failed to get status for pod" podUID="bfb85302-c965-417f-8c35-9aff2e464281" pod="openshift-kube-apiserver/installer-3-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:11:45.462971 master-0 kubenswrapper[29252]: E1203 20:11:45.462875 29252 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Dec 03 20:11:45.820181 master-0 kubenswrapper[29252]: I1203 20:11:45.820002 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_efa3433149c0833909dd6c97d45272ed/kube-apiserver-cert-syncer/0.log" Dec 03 20:11:46.227420 master-0 kubenswrapper[29252]: I1203 20:11:46.227347 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 20:11:46.229280 master-0 kubenswrapper[29252]: I1203 20:11:46.229199 29252 status_manager.go:851] "Failed to get status for pod" podUID="bfb85302-c965-417f-8c35-9aff2e464281" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:11:46.264593 master-0 kubenswrapper[29252]: E1203 20:11:46.264516 29252 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Dec 03 20:11:46.320465 master-0 kubenswrapper[29252]: I1203 20:11:46.320393 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/bfb85302-c965-417f-8c35-9aff2e464281-var-lock\") pod \"bfb85302-c965-417f-8c35-9aff2e464281\" (UID: \"bfb85302-c965-417f-8c35-9aff2e464281\") " Dec 03 20:11:46.320674 master-0 kubenswrapper[29252]: I1203 20:11:46.320486 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfb85302-c965-417f-8c35-9aff2e464281-kube-api-access\") pod \"bfb85302-c965-417f-8c35-9aff2e464281\" (UID: \"bfb85302-c965-417f-8c35-9aff2e464281\") " Dec 03 20:11:46.320674 master-0 kubenswrapper[29252]: I1203 20:11:46.320520 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bfb85302-c965-417f-8c35-9aff2e464281-kubelet-dir\") pod \"bfb85302-c965-417f-8c35-9aff2e464281\" (UID: \"bfb85302-c965-417f-8c35-9aff2e464281\") " Dec 03 20:11:46.320674 master-0 kubenswrapper[29252]: I1203 20:11:46.320552 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfb85302-c965-417f-8c35-9aff2e464281-var-lock" (OuterVolumeSpecName: "var-lock") pod "bfb85302-c965-417f-8c35-9aff2e464281" (UID: "bfb85302-c965-417f-8c35-9aff2e464281"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:11:46.320870 master-0 kubenswrapper[29252]: I1203 20:11:46.320696 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfb85302-c965-417f-8c35-9aff2e464281-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bfb85302-c965-417f-8c35-9aff2e464281" (UID: "bfb85302-c965-417f-8c35-9aff2e464281"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:11:46.321091 master-0 kubenswrapper[29252]: I1203 20:11:46.321055 29252 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bfb85302-c965-417f-8c35-9aff2e464281-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:11:46.321091 master-0 kubenswrapper[29252]: I1203 20:11:46.321085 29252 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bfb85302-c965-417f-8c35-9aff2e464281-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 20:11:46.325608 master-0 kubenswrapper[29252]: I1203 20:11:46.325405 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb85302-c965-417f-8c35-9aff2e464281-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bfb85302-c965-417f-8c35-9aff2e464281" (UID: "bfb85302-c965-417f-8c35-9aff2e464281"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:11:46.422627 master-0 kubenswrapper[29252]: I1203 20:11:46.422530 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfb85302-c965-417f-8c35-9aff2e464281-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 20:11:46.749656 master-0 kubenswrapper[29252]: I1203 20:11:46.749578 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_efa3433149c0833909dd6c97d45272ed/kube-apiserver-cert-syncer/0.log" Dec 03 20:11:46.751010 master-0 kubenswrapper[29252]: I1203 20:11:46.750954 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:11:46.752413 master-0 kubenswrapper[29252]: I1203 20:11:46.752323 29252 status_manager.go:851] "Failed to get status for pod" podUID="efa3433149c0833909dd6c97d45272ed" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:11:46.753299 master-0 kubenswrapper[29252]: I1203 20:11:46.753226 29252 status_manager.go:851] "Failed to get status for pod" podUID="bfb85302-c965-417f-8c35-9aff2e464281" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:11:46.828640 master-0 kubenswrapper[29252]: I1203 20:11:46.828523 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-cert-dir\") pod \"efa3433149c0833909dd6c97d45272ed\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " Dec 03 20:11:46.828640 master-0 kubenswrapper[29252]: I1203 20:11:46.828596 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-audit-dir\") pod \"efa3433149c0833909dd6c97d45272ed\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " Dec 03 20:11:46.829094 master-0 kubenswrapper[29252]: I1203 20:11:46.828686 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-resource-dir\") pod \"efa3433149c0833909dd6c97d45272ed\" (UID: \"efa3433149c0833909dd6c97d45272ed\") " Dec 03 20:11:46.829094 master-0 
kubenswrapper[29252]: I1203 20:11:46.828734 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "efa3433149c0833909dd6c97d45272ed" (UID: "efa3433149c0833909dd6c97d45272ed"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:11:46.829094 master-0 kubenswrapper[29252]: I1203 20:11:46.828671 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "efa3433149c0833909dd6c97d45272ed" (UID: "efa3433149c0833909dd6c97d45272ed"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:11:46.829094 master-0 kubenswrapper[29252]: I1203 20:11:46.828908 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "efa3433149c0833909dd6c97d45272ed" (UID: "efa3433149c0833909dd6c97d45272ed"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:11:46.829094 master-0 kubenswrapper[29252]: I1203 20:11:46.829021 29252 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-cert-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:11:46.829094 master-0 kubenswrapper[29252]: I1203 20:11:46.829042 29252 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-audit-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:11:46.832186 master-0 kubenswrapper[29252]: I1203 20:11:46.832111 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"bfb85302-c965-417f-8c35-9aff2e464281","Type":"ContainerDied","Data":"a835a03167da55e2d0c5aa4c861edfc59d0f4ed81b8a01a8e922e73089c258b5"} Dec 03 20:11:46.832186 master-0 kubenswrapper[29252]: I1203 20:11:46.832141 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 20:11:46.832186 master-0 kubenswrapper[29252]: I1203 20:11:46.832169 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a835a03167da55e2d0c5aa4c861edfc59d0f4ed81b8a01a8e922e73089c258b5" Dec 03 20:11:46.836136 master-0 kubenswrapper[29252]: I1203 20:11:46.836078 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_efa3433149c0833909dd6c97d45272ed/kube-apiserver-cert-syncer/0.log" Dec 03 20:11:46.837430 master-0 kubenswrapper[29252]: I1203 20:11:46.837372 29252 generic.go:334] "Generic (PLEG): container finished" podID="efa3433149c0833909dd6c97d45272ed" containerID="22463a8cc479cad0b18b7fc08c942143546ab5eb5ece91f4b438d1d815531859" exitCode=0 Dec 03 20:11:46.837430 master-0 kubenswrapper[29252]: I1203 20:11:46.837433 29252 scope.go:117] "RemoveContainer" containerID="76f9ca425f52f5ae839d8475d3bce62ca26fad716e336398fd44de61a9633f54" Dec 03 20:11:46.837688 master-0 kubenswrapper[29252]: I1203 20:11:46.837577 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:11:46.862342 master-0 kubenswrapper[29252]: I1203 20:11:46.862235 29252 scope.go:117] "RemoveContainer" containerID="eaee7d593be548d765bd8f2d4c68f44dae98c436387a0e4202467ebdd81d0080" Dec 03 20:11:46.885204 master-0 kubenswrapper[29252]: I1203 20:11:46.885083 29252 status_manager.go:851] "Failed to get status for pod" podUID="efa3433149c0833909dd6c97d45272ed" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:11:46.887377 master-0 kubenswrapper[29252]: I1203 20:11:46.887237 29252 status_manager.go:851] "Failed to get status for pod" podUID="bfb85302-c965-417f-8c35-9aff2e464281" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:11:46.888518 master-0 kubenswrapper[29252]: I1203 20:11:46.888432 29252 status_manager.go:851] "Failed to get status for pod" podUID="efa3433149c0833909dd6c97d45272ed" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:11:46.889367 master-0 kubenswrapper[29252]: I1203 20:11:46.889279 29252 status_manager.go:851] "Failed to get status for pod" podUID="bfb85302-c965-417f-8c35-9aff2e464281" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:11:46.902594 master-0 kubenswrapper[29252]: I1203 
20:11:46.902534 29252 scope.go:117] "RemoveContainer" containerID="906c3bca6e6dba68fc1f6de40f7c8c547d5581dd0b1d1d54e1eb0216877477d1" Dec 03 20:11:46.923357 master-0 kubenswrapper[29252]: I1203 20:11:46.923290 29252 scope.go:117] "RemoveContainer" containerID="009859dfb35cd527b6f73d6e859baa9e361df0c4b1b88a554c64f8f7fef46bf9" Dec 03 20:11:46.930495 master-0 kubenswrapper[29252]: I1203 20:11:46.930405 29252 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/efa3433149c0833909dd6c97d45272ed-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:11:46.946740 master-0 kubenswrapper[29252]: I1203 20:11:46.946006 29252 scope.go:117] "RemoveContainer" containerID="22463a8cc479cad0b18b7fc08c942143546ab5eb5ece91f4b438d1d815531859" Dec 03 20:11:46.965836 master-0 kubenswrapper[29252]: I1203 20:11:46.965749 29252 scope.go:117] "RemoveContainer" containerID="b7f227f0c18811b7cbe2379656751997168d72c962c1085bafeb8e91aa107a35" Dec 03 20:11:46.983842 master-0 kubenswrapper[29252]: I1203 20:11:46.983752 29252 scope.go:117] "RemoveContainer" containerID="76f9ca425f52f5ae839d8475d3bce62ca26fad716e336398fd44de61a9633f54" Dec 03 20:11:46.984271 master-0 kubenswrapper[29252]: E1203 20:11:46.984211 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76f9ca425f52f5ae839d8475d3bce62ca26fad716e336398fd44de61a9633f54\": container with ID starting with 76f9ca425f52f5ae839d8475d3bce62ca26fad716e336398fd44de61a9633f54 not found: ID does not exist" containerID="76f9ca425f52f5ae839d8475d3bce62ca26fad716e336398fd44de61a9633f54" Dec 03 20:11:46.984271 master-0 kubenswrapper[29252]: I1203 20:11:46.984255 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f9ca425f52f5ae839d8475d3bce62ca26fad716e336398fd44de61a9633f54"} err="failed to get container status 
\"76f9ca425f52f5ae839d8475d3bce62ca26fad716e336398fd44de61a9633f54\": rpc error: code = NotFound desc = could not find container \"76f9ca425f52f5ae839d8475d3bce62ca26fad716e336398fd44de61a9633f54\": container with ID starting with 76f9ca425f52f5ae839d8475d3bce62ca26fad716e336398fd44de61a9633f54 not found: ID does not exist" Dec 03 20:11:46.984520 master-0 kubenswrapper[29252]: I1203 20:11:46.984283 29252 scope.go:117] "RemoveContainer" containerID="eaee7d593be548d765bd8f2d4c68f44dae98c436387a0e4202467ebdd81d0080" Dec 03 20:11:46.984625 master-0 kubenswrapper[29252]: E1203 20:11:46.984562 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaee7d593be548d765bd8f2d4c68f44dae98c436387a0e4202467ebdd81d0080\": container with ID starting with eaee7d593be548d765bd8f2d4c68f44dae98c436387a0e4202467ebdd81d0080 not found: ID does not exist" containerID="eaee7d593be548d765bd8f2d4c68f44dae98c436387a0e4202467ebdd81d0080" Dec 03 20:11:46.984625 master-0 kubenswrapper[29252]: I1203 20:11:46.984588 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaee7d593be548d765bd8f2d4c68f44dae98c436387a0e4202467ebdd81d0080"} err="failed to get container status \"eaee7d593be548d765bd8f2d4c68f44dae98c436387a0e4202467ebdd81d0080\": rpc error: code = NotFound desc = could not find container \"eaee7d593be548d765bd8f2d4c68f44dae98c436387a0e4202467ebdd81d0080\": container with ID starting with eaee7d593be548d765bd8f2d4c68f44dae98c436387a0e4202467ebdd81d0080 not found: ID does not exist" Dec 03 20:11:46.984625 master-0 kubenswrapper[29252]: I1203 20:11:46.984605 29252 scope.go:117] "RemoveContainer" containerID="906c3bca6e6dba68fc1f6de40f7c8c547d5581dd0b1d1d54e1eb0216877477d1" Dec 03 20:11:46.984936 master-0 kubenswrapper[29252]: E1203 20:11:46.984915 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"906c3bca6e6dba68fc1f6de40f7c8c547d5581dd0b1d1d54e1eb0216877477d1\": container with ID starting with 906c3bca6e6dba68fc1f6de40f7c8c547d5581dd0b1d1d54e1eb0216877477d1 not found: ID does not exist" containerID="906c3bca6e6dba68fc1f6de40f7c8c547d5581dd0b1d1d54e1eb0216877477d1" Dec 03 20:11:46.984936 master-0 kubenswrapper[29252]: I1203 20:11:46.984934 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906c3bca6e6dba68fc1f6de40f7c8c547d5581dd0b1d1d54e1eb0216877477d1"} err="failed to get container status \"906c3bca6e6dba68fc1f6de40f7c8c547d5581dd0b1d1d54e1eb0216877477d1\": rpc error: code = NotFound desc = could not find container \"906c3bca6e6dba68fc1f6de40f7c8c547d5581dd0b1d1d54e1eb0216877477d1\": container with ID starting with 906c3bca6e6dba68fc1f6de40f7c8c547d5581dd0b1d1d54e1eb0216877477d1 not found: ID does not exist" Dec 03 20:11:46.985124 master-0 kubenswrapper[29252]: I1203 20:11:46.984947 29252 scope.go:117] "RemoveContainer" containerID="009859dfb35cd527b6f73d6e859baa9e361df0c4b1b88a554c64f8f7fef46bf9" Dec 03 20:11:46.985344 master-0 kubenswrapper[29252]: E1203 20:11:46.985288 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"009859dfb35cd527b6f73d6e859baa9e361df0c4b1b88a554c64f8f7fef46bf9\": container with ID starting with 009859dfb35cd527b6f73d6e859baa9e361df0c4b1b88a554c64f8f7fef46bf9 not found: ID does not exist" containerID="009859dfb35cd527b6f73d6e859baa9e361df0c4b1b88a554c64f8f7fef46bf9" Dec 03 20:11:46.985344 master-0 kubenswrapper[29252]: I1203 20:11:46.985318 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"009859dfb35cd527b6f73d6e859baa9e361df0c4b1b88a554c64f8f7fef46bf9"} err="failed to get container status \"009859dfb35cd527b6f73d6e859baa9e361df0c4b1b88a554c64f8f7fef46bf9\": rpc error: code = NotFound desc = could not find container 
\"009859dfb35cd527b6f73d6e859baa9e361df0c4b1b88a554c64f8f7fef46bf9\": container with ID starting with 009859dfb35cd527b6f73d6e859baa9e361df0c4b1b88a554c64f8f7fef46bf9 not found: ID does not exist" Dec 03 20:11:46.985344 master-0 kubenswrapper[29252]: I1203 20:11:46.985337 29252 scope.go:117] "RemoveContainer" containerID="22463a8cc479cad0b18b7fc08c942143546ab5eb5ece91f4b438d1d815531859" Dec 03 20:11:46.985874 master-0 kubenswrapper[29252]: E1203 20:11:46.985760 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22463a8cc479cad0b18b7fc08c942143546ab5eb5ece91f4b438d1d815531859\": container with ID starting with 22463a8cc479cad0b18b7fc08c942143546ab5eb5ece91f4b438d1d815531859 not found: ID does not exist" containerID="22463a8cc479cad0b18b7fc08c942143546ab5eb5ece91f4b438d1d815531859" Dec 03 20:11:46.985999 master-0 kubenswrapper[29252]: I1203 20:11:46.985879 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22463a8cc479cad0b18b7fc08c942143546ab5eb5ece91f4b438d1d815531859"} err="failed to get container status \"22463a8cc479cad0b18b7fc08c942143546ab5eb5ece91f4b438d1d815531859\": rpc error: code = NotFound desc = could not find container \"22463a8cc479cad0b18b7fc08c942143546ab5eb5ece91f4b438d1d815531859\": container with ID starting with 22463a8cc479cad0b18b7fc08c942143546ab5eb5ece91f4b438d1d815531859 not found: ID does not exist" Dec 03 20:11:46.985999 master-0 kubenswrapper[29252]: I1203 20:11:46.985941 29252 scope.go:117] "RemoveContainer" containerID="b7f227f0c18811b7cbe2379656751997168d72c962c1085bafeb8e91aa107a35" Dec 03 20:11:46.986484 master-0 kubenswrapper[29252]: E1203 20:11:46.986423 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7f227f0c18811b7cbe2379656751997168d72c962c1085bafeb8e91aa107a35\": container with ID starting with 
b7f227f0c18811b7cbe2379656751997168d72c962c1085bafeb8e91aa107a35 not found: ID does not exist" containerID="b7f227f0c18811b7cbe2379656751997168d72c962c1085bafeb8e91aa107a35" Dec 03 20:11:46.986570 master-0 kubenswrapper[29252]: I1203 20:11:46.986488 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7f227f0c18811b7cbe2379656751997168d72c962c1085bafeb8e91aa107a35"} err="failed to get container status \"b7f227f0c18811b7cbe2379656751997168d72c962c1085bafeb8e91aa107a35\": rpc error: code = NotFound desc = could not find container \"b7f227f0c18811b7cbe2379656751997168d72c962c1085bafeb8e91aa107a35\": container with ID starting with b7f227f0c18811b7cbe2379656751997168d72c962c1085bafeb8e91aa107a35 not found: ID does not exist" Dec 03 20:11:47.422814 master-0 kubenswrapper[29252]: I1203 20:11:47.422712 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efa3433149c0833909dd6c97d45272ed" path="/var/lib/kubelet/pods/efa3433149c0833909dd6c97d45272ed/volumes" Dec 03 20:11:47.865641 master-0 kubenswrapper[29252]: E1203 20:11:47.865592 29252 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Dec 03 20:11:49.437885 master-0 kubenswrapper[29252]: E1203 20:11:49.437669 29252 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-master-0.187dcd9d47ef27f4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:efa3433149c0833909dd6c97d45272ed,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Killing,Message:Stopping container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 20:11:44.399517684 +0000 UTC m=+139.213062647,LastTimestamp:2025-12-03 20:11:44.399517684 +0000 UTC m=+139.213062647,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 20:11:49.451322 master-0 kubenswrapper[29252]: E1203 20:11:49.451200 29252 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:11:49.451867 master-0 kubenswrapper[29252]: I1203 20:11:49.451743 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:11:49.488462 master-0 kubenswrapper[29252]: W1203 20:11:49.488391 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbad610cd8689b0972c02840bb486a62.slice/crio-e4b37a5c781587709eb05e3a0ac7c7cf157c807218206b5ca5e6d56f75bd442b WatchSource:0}: Error finding container e4b37a5c781587709eb05e3a0ac7c7cf157c807218206b5ca5e6d56f75bd442b: Status 404 returned error can't find the container with id e4b37a5c781587709eb05e3a0ac7c7cf157c807218206b5ca5e6d56f75bd442b Dec 03 20:11:49.865288 master-0 kubenswrapper[29252]: I1203 20:11:49.865235 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"cbad610cd8689b0972c02840bb486a62","Type":"ContainerStarted","Data":"49f0b324bc2753e4944ad48b25a1c7c849e6e6354efcd8b10464f50e49e0bdc5"} Dec 03 20:11:49.865288 master-0 kubenswrapper[29252]: I1203 20:11:49.865286 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"cbad610cd8689b0972c02840bb486a62","Type":"ContainerStarted","Data":"e4b37a5c781587709eb05e3a0ac7c7cf157c807218206b5ca5e6d56f75bd442b"} Dec 03 20:11:49.866174 master-0 kubenswrapper[29252]: I1203 20:11:49.866129 29252 status_manager.go:851] "Failed to get status for pod" podUID="bfb85302-c965-417f-8c35-9aff2e464281" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:11:49.866219 master-0 kubenswrapper[29252]: E1203 20:11:49.866183 29252 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": 
dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:11:51.067549 master-0 kubenswrapper[29252]: E1203 20:11:51.067456 29252 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Dec 03 20:11:53.973442 master-0 kubenswrapper[29252]: E1203 20:11:53.973248 29252 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-master-0.187dcd9d47ef27f4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:efa3433149c0833909dd6c97d45272ed,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Killing,Message:Stopping container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 20:11:44.399517684 +0000 UTC m=+139.213062647,LastTimestamp:2025-12-03 20:11:44.399517684 +0000 UTC m=+139.213062647,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 20:11:55.418840 master-0 kubenswrapper[29252]: I1203 20:11:55.418679 29252 status_manager.go:851] "Failed to get status for pod" podUID="bfb85302-c965-417f-8c35-9aff2e464281" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 
03 20:11:56.415985 master-0 kubenswrapper[29252]: I1203 20:11:56.415904 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:11:56.417526 master-0 kubenswrapper[29252]: I1203 20:11:56.417425 29252 status_manager.go:851] "Failed to get status for pod" podUID="bfb85302-c965-417f-8c35-9aff2e464281" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:11:56.439677 master-0 kubenswrapper[29252]: I1203 20:11:56.439588 29252 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="633b751f-3f43-4c59-8114-53a2f8659571" Dec 03 20:11:56.439677 master-0 kubenswrapper[29252]: I1203 20:11:56.439644 29252 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="633b751f-3f43-4c59-8114-53a2f8659571" Dec 03 20:11:56.440895 master-0 kubenswrapper[29252]: E1203 20:11:56.440808 29252 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:11:56.441554 master-0 kubenswrapper[29252]: I1203 20:11:56.441499 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:11:56.478261 master-0 kubenswrapper[29252]: W1203 20:11:56.478215 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod382c2026eb84cf3d7672e1fe1646be64.slice/crio-d74cd74e464fd62b20c7e7f4b822f898761ee7313dcf2f5e14f219248f595bfe WatchSource:0}: Error finding container d74cd74e464fd62b20c7e7f4b822f898761ee7313dcf2f5e14f219248f595bfe: Status 404 returned error can't find the container with id d74cd74e464fd62b20c7e7f4b822f898761ee7313dcf2f5e14f219248f595bfe Dec 03 20:11:56.926056 master-0 kubenswrapper[29252]: I1203 20:11:56.925938 29252 generic.go:334] "Generic (PLEG): container finished" podID="382c2026eb84cf3d7672e1fe1646be64" containerID="875c7edfcaad228f07b8db53cf8ddb0aab324d9c88bddb362bf3f034691caa0b" exitCode=0 Dec 03 20:11:56.926253 master-0 kubenswrapper[29252]: I1203 20:11:56.926105 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"382c2026eb84cf3d7672e1fe1646be64","Type":"ContainerDied","Data":"875c7edfcaad228f07b8db53cf8ddb0aab324d9c88bddb362bf3f034691caa0b"} Dec 03 20:11:56.926369 master-0 kubenswrapper[29252]: I1203 20:11:56.926352 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"382c2026eb84cf3d7672e1fe1646be64","Type":"ContainerStarted","Data":"d74cd74e464fd62b20c7e7f4b822f898761ee7313dcf2f5e14f219248f595bfe"} Dec 03 20:11:56.927101 master-0 kubenswrapper[29252]: I1203 20:11:56.927042 29252 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="633b751f-3f43-4c59-8114-53a2f8659571" Dec 03 20:11:56.927101 master-0 kubenswrapper[29252]: I1203 20:11:56.927091 29252 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
podUID="633b751f-3f43-4c59-8114-53a2f8659571" Dec 03 20:11:56.928060 master-0 kubenswrapper[29252]: I1203 20:11:56.928036 29252 status_manager.go:851] "Failed to get status for pod" podUID="bfb85302-c965-417f-8c35-9aff2e464281" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:11:56.928245 master-0 kubenswrapper[29252]: E1203 20:11:56.928187 29252 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:11:57.971168 master-0 kubenswrapper[29252]: I1203 20:11:57.971099 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"382c2026eb84cf3d7672e1fe1646be64","Type":"ContainerStarted","Data":"ef71ce2641c94253ec625f7906b9b76cac05532370b5b06e58580131a3c97717"} Dec 03 20:11:57.971168 master-0 kubenswrapper[29252]: I1203 20:11:57.971160 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"382c2026eb84cf3d7672e1fe1646be64","Type":"ContainerStarted","Data":"545a2aa47bbcc1149d070c967862238b01682d831ff51d5729ad98e0bbed88a9"} Dec 03 20:11:57.971168 master-0 kubenswrapper[29252]: I1203 20:11:57.971174 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"382c2026eb84cf3d7672e1fe1646be64","Type":"ContainerStarted","Data":"964cf291812143ff2c57dd42685f00b48fdffd3b9344956cd88cdb00e2a88dd2"} Dec 03 20:11:58.985164 master-0 kubenswrapper[29252]: I1203 20:11:58.985112 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"382c2026eb84cf3d7672e1fe1646be64","Type":"ContainerStarted","Data":"119d4b6b3038dbead7fe6b9e33314320ced1c64886e5d19a0fc0f37d99ce9137"} Dec 03 20:11:58.985164 master-0 kubenswrapper[29252]: I1203 20:11:58.985167 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"382c2026eb84cf3d7672e1fe1646be64","Type":"ContainerStarted","Data":"e308a447fef7af8ae0711fb67451ec5d981dbb028a918b0a38f448e0927e409f"} Dec 03 20:11:58.985672 master-0 kubenswrapper[29252]: I1203 20:11:58.985443 29252 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="633b751f-3f43-4c59-8114-53a2f8659571" Dec 03 20:11:58.985672 master-0 kubenswrapper[29252]: I1203 20:11:58.985461 29252 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="633b751f-3f43-4c59-8114-53a2f8659571" Dec 03 20:11:58.985765 master-0 kubenswrapper[29252]: I1203 20:11:58.985719 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:11:59.993130 master-0 kubenswrapper[29252]: I1203 20:11:59.993064 29252 generic.go:334] "Generic (PLEG): container finished" podID="af2023e1-9c7a-40af-a6bf-fba31c3565b1" containerID="ac900c0e3bb1d9c962bbb16a701da09c17b23c0e09631a6ada5617d6d0661d7b" exitCode=0 Dec 03 20:11:59.993731 master-0 kubenswrapper[29252]: I1203 20:11:59.993178 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" event={"ID":"af2023e1-9c7a-40af-a6bf-fba31c3565b1","Type":"ContainerDied","Data":"ac900c0e3bb1d9c962bbb16a701da09c17b23c0e09631a6ada5617d6d0661d7b"} Dec 03 20:11:59.993731 master-0 kubenswrapper[29252]: I1203 20:11:59.993269 29252 scope.go:117] "RemoveContainer" containerID="9ee7a9ba017971cc72c48a14fbe564128a44ff608d460db457bf85730f38fd52" Dec 
03 20:11:59.993731 master-0 kubenswrapper[29252]: I1203 20:11:59.993689 29252 scope.go:117] "RemoveContainer" containerID="ac900c0e3bb1d9c962bbb16a701da09c17b23c0e09631a6ada5617d6d0661d7b" Dec 03 20:11:59.997307 master-0 kubenswrapper[29252]: I1203 20:11:59.997260 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_6fb0810126310d28fb5532674012978b/kube-controller-manager/0.log" Dec 03 20:11:59.997376 master-0 kubenswrapper[29252]: I1203 20:11:59.997333 29252 generic.go:334] "Generic (PLEG): container finished" podID="6fb0810126310d28fb5532674012978b" containerID="d2a9fff66cc8aec805af934297108d64fdaa0ffb64bc75c967dcb4742c7e5f5f" exitCode=1 Dec 03 20:12:00.001798 master-0 kubenswrapper[29252]: I1203 20:11:59.997409 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"6fb0810126310d28fb5532674012978b","Type":"ContainerDied","Data":"d2a9fff66cc8aec805af934297108d64fdaa0ffb64bc75c967dcb4742c7e5f5f"} Dec 03 20:12:00.001798 master-0 kubenswrapper[29252]: I1203 20:11:59.998187 29252 scope.go:117] "RemoveContainer" containerID="d2a9fff66cc8aec805af934297108d64fdaa0ffb64bc75c967dcb4742c7e5f5f" Dec 03 20:12:00.466193 master-0 kubenswrapper[29252]: I1203 20:12:00.466124 29252 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:12:00.466481 master-0 kubenswrapper[29252]: I1203 20:12:00.466234 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:12:00.466481 master-0 kubenswrapper[29252]: I1203 20:12:00.466276 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:12:01.007293 master-0 kubenswrapper[29252]: 
I1203 20:12:01.007206 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" event={"ID":"af2023e1-9c7a-40af-a6bf-fba31c3565b1","Type":"ContainerStarted","Data":"8eb6c97ced72ac909512a8fb9a80e8c3a600e09b1d0a180070de51dc266f3d98"} Dec 03 20:12:01.013699 master-0 kubenswrapper[29252]: I1203 20:12:01.013626 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_6fb0810126310d28fb5532674012978b/kube-controller-manager/0.log" Dec 03 20:12:01.013895 master-0 kubenswrapper[29252]: I1203 20:12:01.013726 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"6fb0810126310d28fb5532674012978b","Type":"ContainerStarted","Data":"1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98"} Dec 03 20:12:01.441893 master-0 kubenswrapper[29252]: I1203 20:12:01.441819 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:12:01.441893 master-0 kubenswrapper[29252]: I1203 20:12:01.441901 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:12:01.448904 master-0 kubenswrapper[29252]: I1203 20:12:01.448851 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:12:02.020896 master-0 kubenswrapper[29252]: I1203 20:12:02.020844 29252 generic.go:334] "Generic (PLEG): container finished" podID="af2023e1-9c7a-40af-a6bf-fba31c3565b1" containerID="8eb6c97ced72ac909512a8fb9a80e8c3a600e09b1d0a180070de51dc266f3d98" exitCode=0 Dec 03 20:12:02.021587 master-0 kubenswrapper[29252]: I1203 20:12:02.020919 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" 
event={"ID":"af2023e1-9c7a-40af-a6bf-fba31c3565b1","Type":"ContainerDied","Data":"8eb6c97ced72ac909512a8fb9a80e8c3a600e09b1d0a180070de51dc266f3d98"} Dec 03 20:12:02.021587 master-0 kubenswrapper[29252]: I1203 20:12:02.020972 29252 scope.go:117] "RemoveContainer" containerID="ac900c0e3bb1d9c962bbb16a701da09c17b23c0e09631a6ada5617d6d0661d7b" Dec 03 20:12:02.021587 master-0 kubenswrapper[29252]: I1203 20:12:02.021418 29252 scope.go:117] "RemoveContainer" containerID="8eb6c97ced72ac909512a8fb9a80e8c3a600e09b1d0a180070de51dc266f3d98" Dec 03 20:12:02.021893 master-0 kubenswrapper[29252]: E1203 20:12:02.021626 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=insights-operator pod=insights-operator-59d99f9b7b-h64kt_openshift-insights(af2023e1-9c7a-40af-a6bf-fba31c3565b1)\"" pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" podUID="af2023e1-9c7a-40af-a6bf-fba31c3565b1" Dec 03 20:12:04.015395 master-0 kubenswrapper[29252]: I1203 20:12:04.015338 29252 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:12:04.037376 master-0 kubenswrapper[29252]: I1203 20:12:04.037320 29252 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="633b751f-3f43-4c59-8114-53a2f8659571" Dec 03 20:12:04.037376 master-0 kubenswrapper[29252]: I1203 20:12:04.037354 29252 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="633b751f-3f43-4c59-8114-53a2f8659571" Dec 03 20:12:04.041140 master-0 kubenswrapper[29252]: I1203 20:12:04.041095 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:12:05.042525 master-0 kubenswrapper[29252]: I1203 20:12:05.042456 29252 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="633b751f-3f43-4c59-8114-53a2f8659571" Dec 03 20:12:05.042525 master-0 kubenswrapper[29252]: I1203 20:12:05.042492 29252 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="633b751f-3f43-4c59-8114-53a2f8659571" Dec 03 20:12:05.449225 master-0 kubenswrapper[29252]: I1203 20:12:05.449155 29252 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="382c2026eb84cf3d7672e1fe1646be64" podUID="06299bca-776c-487d-b578-c712c1a65372" Dec 03 20:12:05.714544 master-0 kubenswrapper[29252]: I1203 20:12:05.714393 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 20:12:10.466340 master-0 kubenswrapper[29252]: I1203 20:12:10.466221 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:12:10.466340 master-0 kubenswrapper[29252]: I1203 20:12:10.466320 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:12:10.475970 master-0 kubenswrapper[29252]: I1203 20:12:10.475885 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:12:11.093817 master-0 kubenswrapper[29252]: I1203 20:12:11.093696 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:12:14.417311 master-0 kubenswrapper[29252]: I1203 20:12:14.417225 29252 scope.go:117] "RemoveContainer" containerID="8eb6c97ced72ac909512a8fb9a80e8c3a600e09b1d0a180070de51dc266f3d98" Dec 03 20:12:14.752755 master-0 kubenswrapper[29252]: 
I1203 20:12:14.752675 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Dec 03 20:12:15.121306 master-0 kubenswrapper[29252]: I1203 20:12:15.121257 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" event={"ID":"af2023e1-9c7a-40af-a6bf-fba31c3565b1","Type":"ContainerStarted","Data":"8618784cb8277c94258ed2a8dfd91f20d4b788eab3ceb47826a93998b38bfea8"} Dec 03 20:12:15.340806 master-0 kubenswrapper[29252]: I1203 20:12:15.340646 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 20:12:15.963189 master-0 kubenswrapper[29252]: I1203 20:12:15.963085 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 20:12:16.134375 master-0 kubenswrapper[29252]: I1203 20:12:16.134276 29252 generic.go:334] "Generic (PLEG): container finished" podID="af2023e1-9c7a-40af-a6bf-fba31c3565b1" containerID="8618784cb8277c94258ed2a8dfd91f20d4b788eab3ceb47826a93998b38bfea8" exitCode=0 Dec 03 20:12:16.134658 master-0 kubenswrapper[29252]: I1203 20:12:16.134357 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" event={"ID":"af2023e1-9c7a-40af-a6bf-fba31c3565b1","Type":"ContainerDied","Data":"8618784cb8277c94258ed2a8dfd91f20d4b788eab3ceb47826a93998b38bfea8"} Dec 03 20:12:16.134658 master-0 kubenswrapper[29252]: I1203 20:12:16.134546 29252 scope.go:117] "RemoveContainer" containerID="8eb6c97ced72ac909512a8fb9a80e8c3a600e09b1d0a180070de51dc266f3d98" Dec 03 20:12:16.135646 master-0 kubenswrapper[29252]: I1203 20:12:16.135598 29252 scope.go:117] "RemoveContainer" containerID="8618784cb8277c94258ed2a8dfd91f20d4b788eab3ceb47826a93998b38bfea8" Dec 03 20:12:16.136368 master-0 kubenswrapper[29252]: E1203 20:12:16.136317 29252 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=insights-operator pod=insights-operator-59d99f9b7b-h64kt_openshift-insights(af2023e1-9c7a-40af-a6bf-fba31c3565b1)\"" pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" podUID="af2023e1-9c7a-40af-a6bf-fba31c3565b1" Dec 03 20:12:16.311003 master-0 kubenswrapper[29252]: I1203 20:12:16.310850 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 20:12:16.533073 master-0 kubenswrapper[29252]: I1203 20:12:16.533002 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 20:12:16.582132 master-0 kubenswrapper[29252]: I1203 20:12:16.581954 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Dec 03 20:12:16.671393 master-0 kubenswrapper[29252]: I1203 20:12:16.671308 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 20:12:16.858648 master-0 kubenswrapper[29252]: I1203 20:12:16.858585 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 20:12:17.016593 master-0 kubenswrapper[29252]: I1203 20:12:17.016476 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 20:12:17.120570 master-0 kubenswrapper[29252]: I1203 20:12:17.120127 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 20:12:17.376819 master-0 kubenswrapper[29252]: I1203 20:12:17.376582 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 
20:12:17.828563 master-0 kubenswrapper[29252]: I1203 20:12:17.828346 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 20:12:17.922774 master-0 kubenswrapper[29252]: I1203 20:12:17.922708 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Dec 03 20:12:18.054260 master-0 kubenswrapper[29252]: I1203 20:12:18.054152 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 20:12:18.091561 master-0 kubenswrapper[29252]: I1203 20:12:18.091354 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 20:12:18.124417 master-0 kubenswrapper[29252]: I1203 20:12:18.124332 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 20:12:18.168084 master-0 kubenswrapper[29252]: I1203 20:12:18.168013 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 20:12:18.370579 master-0 kubenswrapper[29252]: I1203 20:12:18.370512 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 20:12:18.372301 master-0 kubenswrapper[29252]: I1203 20:12:18.372257 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 20:12:18.485427 master-0 kubenswrapper[29252]: I1203 20:12:18.485287 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 20:12:18.674350 master-0 kubenswrapper[29252]: I1203 20:12:18.674171 29252 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Dec 03 20:12:18.699086 master-0 kubenswrapper[29252]: I1203 20:12:18.697944 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 20:12:18.792428 master-0 kubenswrapper[29252]: I1203 20:12:18.792373 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Dec 03 20:12:18.933000 master-0 kubenswrapper[29252]: I1203 20:12:18.932818 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 20:12:19.007698 master-0 kubenswrapper[29252]: I1203 20:12:19.007643 29252 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 20:12:19.038024 master-0 kubenswrapper[29252]: I1203 20:12:19.037925 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 20:12:19.066971 master-0 kubenswrapper[29252]: I1203 20:12:19.066930 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 20:12:19.067520 master-0 kubenswrapper[29252]: I1203 20:12:19.067244 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 20:12:19.288564 master-0 kubenswrapper[29252]: I1203 20:12:19.288465 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 20:12:19.309523 master-0 kubenswrapper[29252]: I1203 20:12:19.309457 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 20:12:19.334308 master-0 kubenswrapper[29252]: I1203 20:12:19.334265 29252 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 20:12:19.400699 master-0 kubenswrapper[29252]: I1203 20:12:19.400646 29252 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 20:12:19.444868 master-0 kubenswrapper[29252]: I1203 20:12:19.444800 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 20:12:19.515223 master-0 kubenswrapper[29252]: I1203 20:12:19.515182 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 20:12:19.541293 master-0 kubenswrapper[29252]: I1203 20:12:19.541170 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 20:12:19.677181 master-0 kubenswrapper[29252]: I1203 20:12:19.677054 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ngglc" Dec 03 20:12:19.755961 master-0 kubenswrapper[29252]: I1203 20:12:19.755877 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 20:12:19.774284 master-0 kubenswrapper[29252]: I1203 20:12:19.774184 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 20:12:20.025816 master-0 kubenswrapper[29252]: I1203 20:12:20.025680 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 20:12:20.056451 master-0 kubenswrapper[29252]: I1203 20:12:20.056355 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 20:12:20.126012 master-0 kubenswrapper[29252]: I1203 20:12:20.125941 29252 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Dec 03 20:12:20.187385 master-0 kubenswrapper[29252]: I1203 20:12:20.187109 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 20:12:20.411646 master-0 kubenswrapper[29252]: I1203 20:12:20.411606 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 20:12:20.418874 master-0 kubenswrapper[29252]: I1203 20:12:20.418772 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 20:12:20.586268 master-0 kubenswrapper[29252]: I1203 20:12:20.586174 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 20:12:20.658221 master-0 kubenswrapper[29252]: I1203 20:12:20.658153 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-ztlqb" Dec 03 20:12:20.713395 master-0 kubenswrapper[29252]: I1203 20:12:20.713216 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 20:12:20.721878 master-0 kubenswrapper[29252]: I1203 20:12:20.721612 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 20:12:20.738694 master-0 kubenswrapper[29252]: I1203 20:12:20.738616 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 20:12:20.874124 master-0 kubenswrapper[29252]: I1203 20:12:20.874035 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 20:12:20.963526 master-0 kubenswrapper[29252]: I1203 20:12:20.963379 29252 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 20:12:21.107485 master-0 kubenswrapper[29252]: I1203 20:12:21.107411 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-l56l4" Dec 03 20:12:21.117345 master-0 kubenswrapper[29252]: I1203 20:12:21.117194 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 20:12:21.213500 master-0 kubenswrapper[29252]: I1203 20:12:21.213405 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Dec 03 20:12:21.238503 master-0 kubenswrapper[29252]: I1203 20:12:21.238304 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 20:12:21.497483 master-0 kubenswrapper[29252]: I1203 20:12:21.497282 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 20:12:21.616347 master-0 kubenswrapper[29252]: I1203 20:12:21.616239 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Dec 03 20:12:21.641837 master-0 kubenswrapper[29252]: I1203 20:12:21.640681 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Dec 03 20:12:21.709620 master-0 kubenswrapper[29252]: I1203 20:12:21.709556 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 20:12:21.799355 master-0 kubenswrapper[29252]: I1203 20:12:21.799189 29252 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 20:12:21.922534 master-0 kubenswrapper[29252]: I1203 20:12:21.922476 29252 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 20:12:22.095725 master-0 kubenswrapper[29252]: I1203 20:12:22.095603 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 20:12:22.097745 master-0 kubenswrapper[29252]: I1203 20:12:22.097705 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Dec 03 20:12:22.225794 master-0 kubenswrapper[29252]: I1203 20:12:22.225730 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 20:12:22.296500 master-0 kubenswrapper[29252]: I1203 20:12:22.296404 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 20:12:22.307258 master-0 kubenswrapper[29252]: I1203 20:12:22.307181 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 20:12:22.453669 master-0 kubenswrapper[29252]: I1203 20:12:22.453594 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Dec 03 20:12:22.508983 master-0 kubenswrapper[29252]: I1203 20:12:22.508880 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 20:12:22.523003 master-0 kubenswrapper[29252]: I1203 20:12:22.522918 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 20:12:22.525734 master-0 kubenswrapper[29252]: I1203 20:12:22.525657 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 20:12:22.527036 master-0 
kubenswrapper[29252]: I1203 20:12:22.526961 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 20:12:22.569491 master-0 kubenswrapper[29252]: I1203 20:12:22.569404 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 20:12:22.585189 master-0 kubenswrapper[29252]: I1203 20:12:22.585115 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 20:12:22.664000 master-0 kubenswrapper[29252]: I1203 20:12:22.663923 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 20:12:22.666649 master-0 kubenswrapper[29252]: I1203 20:12:22.666575 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Dec 03 20:12:22.830597 master-0 kubenswrapper[29252]: I1203 20:12:22.830408 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 20:12:22.851410 master-0 kubenswrapper[29252]: I1203 20:12:22.851343 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 20:12:22.884300 master-0 kubenswrapper[29252]: I1203 20:12:22.884231 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Dec 03 20:12:23.014504 master-0 kubenswrapper[29252]: I1203 20:12:23.014441 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Dec 03 20:12:23.017493 master-0 kubenswrapper[29252]: I1203 20:12:23.017430 29252 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 20:12:23.110967 master-0 kubenswrapper[29252]: I1203 20:12:23.110900 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 20:12:23.200707 master-0 kubenswrapper[29252]: I1203 20:12:23.200614 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 20:12:23.312144 master-0 kubenswrapper[29252]: I1203 20:12:23.312056 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 20:12:23.347397 master-0 kubenswrapper[29252]: I1203 20:12:23.347291 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-krxhq" Dec 03 20:12:23.532744 master-0 kubenswrapper[29252]: I1203 20:12:23.532546 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 20:12:23.553979 master-0 kubenswrapper[29252]: I1203 20:12:23.553894 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 20:12:23.564976 master-0 kubenswrapper[29252]: I1203 20:12:23.564922 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 20:12:23.567718 master-0 kubenswrapper[29252]: I1203 20:12:23.567625 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 20:12:23.794276 master-0 kubenswrapper[29252]: I1203 20:12:23.793958 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 20:12:23.794569 master-0 kubenswrapper[29252]: I1203 20:12:23.794539 29252 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 03 20:12:23.968601 master-0 kubenswrapper[29252]: I1203 20:12:23.968520 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 03 20:12:24.009289 master-0 kubenswrapper[29252]: I1203 20:12:24.009225 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Dec 03 20:12:24.035550 master-0 kubenswrapper[29252]: I1203 20:12:24.035467 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Dec 03 20:12:24.064832 master-0 kubenswrapper[29252]: I1203 20:12:24.064666 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 03 20:12:24.165290 master-0 kubenswrapper[29252]: I1203 20:12:24.163901 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Dec 03 20:12:24.165290 master-0 kubenswrapper[29252]: I1203 20:12:24.164129 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 03 20:12:24.413344 master-0 kubenswrapper[29252]: I1203 20:12:24.413251 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Dec 03 20:12:24.430971 master-0 kubenswrapper[29252]: I1203 20:12:24.430850 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Dec 03 20:12:24.471980 master-0 kubenswrapper[29252]: I1203 20:12:24.471919 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 03 20:12:24.486523 master-0 kubenswrapper[29252]: I1203 20:12:24.486456 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 03 20:12:24.493453 master-0 kubenswrapper[29252]: I1203 20:12:24.493409 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 03 20:12:24.538376 master-0 kubenswrapper[29252]: I1203 20:12:24.538292 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 03 20:12:24.552758 master-0 kubenswrapper[29252]: I1203 20:12:24.552676 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 03 20:12:24.628225 master-0 kubenswrapper[29252]: I1203 20:12:24.628108 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Dec 03 20:12:24.667416 master-0 kubenswrapper[29252]: I1203 20:12:24.667272 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Dec 03 20:12:24.737470 master-0 kubenswrapper[29252]: I1203 20:12:24.737412 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Dec 03 20:12:24.751963 master-0 kubenswrapper[29252]: I1203 20:12:24.751895 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 03 20:12:24.784138 master-0 kubenswrapper[29252]: I1203 20:12:24.784078 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 03 20:12:24.832931 master-0 kubenswrapper[29252]: I1203 20:12:24.832864 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 03 20:12:24.852988 master-0 kubenswrapper[29252]: I1203 20:12:24.852932 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Dec 03 20:12:24.863570 master-0 kubenswrapper[29252]: I1203 20:12:24.863517 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 03 20:12:24.909154 master-0 kubenswrapper[29252]: I1203 20:12:24.909068 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Dec 03 20:12:24.933703 master-0 kubenswrapper[29252]: I1203 20:12:24.933546 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 03 20:12:24.950268 master-0 kubenswrapper[29252]: I1203 20:12:24.950189 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 03 20:12:25.019825 master-0 kubenswrapper[29252]: I1203 20:12:25.019704 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 03 20:12:25.110951 master-0 kubenswrapper[29252]: I1203 20:12:25.110873 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 03 20:12:25.196670 master-0 kubenswrapper[29252]: I1203 20:12:25.196505 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Dec 03 20:12:25.242312 master-0 kubenswrapper[29252]: I1203 20:12:25.242222 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 03 20:12:25.265431 master-0 kubenswrapper[29252]: I1203 20:12:25.265360 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 03 20:12:25.310308 master-0 kubenswrapper[29252]: I1203 20:12:25.310206 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Dec 03 20:12:25.484243 master-0 kubenswrapper[29252]: I1203 20:12:25.484090 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 03 20:12:25.502228 master-0 kubenswrapper[29252]: I1203 20:12:25.496072 29252 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 03 20:12:25.511293 master-0 kubenswrapper[29252]: I1203 20:12:25.511239 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Dec 03 20:12:25.647439 master-0 kubenswrapper[29252]: I1203 20:12:25.647368 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Dec 03 20:12:25.650908 master-0 kubenswrapper[29252]: I1203 20:12:25.650858 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 03 20:12:25.745379 master-0 kubenswrapper[29252]: I1203 20:12:25.742400 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Dec 03 20:12:25.752107 master-0 kubenswrapper[29252]: I1203 20:12:25.752062 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 03 20:12:25.783277 master-0 kubenswrapper[29252]: I1203 20:12:25.783203 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 03 20:12:25.871713 master-0 kubenswrapper[29252]: I1203 20:12:25.871655 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 03 20:12:26.014757 master-0 kubenswrapper[29252]: I1203 20:12:26.014617 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 03 20:12:26.043254 master-0 kubenswrapper[29252]: I1203 20:12:26.043196 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 03 20:12:26.054517 master-0 kubenswrapper[29252]: I1203 20:12:26.054446 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Dec 03 20:12:26.110507 master-0 kubenswrapper[29252]: I1203 20:12:26.110452 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 03 20:12:26.117110 master-0 kubenswrapper[29252]: I1203 20:12:26.117063 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Dec 03 20:12:26.122866 master-0 kubenswrapper[29252]: I1203 20:12:26.122827 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 03 20:12:26.156173 master-0 kubenswrapper[29252]: I1203 20:12:26.156137 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 03 20:12:26.221371 master-0 kubenswrapper[29252]: I1203 20:12:26.221317 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 03 20:12:26.289969 master-0 kubenswrapper[29252]: I1203 20:12:26.289806 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 03 20:12:26.371426 master-0 kubenswrapper[29252]: I1203 20:12:26.371360 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Dec 03 20:12:26.425799 master-0 kubenswrapper[29252]: I1203 20:12:26.425734 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Dec 03 20:12:26.429227 master-0 kubenswrapper[29252]: I1203 20:12:26.429202 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Dec 03 20:12:26.478850 master-0 kubenswrapper[29252]: I1203 20:12:26.478811 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Dec 03 20:12:26.522461 master-0 kubenswrapper[29252]: I1203 20:12:26.522423 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 03 20:12:26.598759 master-0 kubenswrapper[29252]: I1203 20:12:26.598645 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Dec 03 20:12:26.619540 master-0 kubenswrapper[29252]: I1203 20:12:26.619492 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 03 20:12:26.627765 master-0 kubenswrapper[29252]: I1203 20:12:26.627669 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Dec 03 20:12:26.721367 master-0 kubenswrapper[29252]: I1203 20:12:26.721258 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 03 20:12:26.839919 master-0 kubenswrapper[29252]: I1203 20:12:26.839878 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 03 20:12:26.895437 master-0 kubenswrapper[29252]: I1203 20:12:26.895404 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Dec 03 20:12:26.961748 master-0 kubenswrapper[29252]: I1203 20:12:26.961709 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 03 20:12:27.096079 master-0 kubenswrapper[29252]: I1203 20:12:27.096031 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 03 20:12:27.121323 master-0 kubenswrapper[29252]: I1203 20:12:27.121275 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 03 20:12:27.176460 master-0 kubenswrapper[29252]: I1203 20:12:27.176316 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Dec 03 20:12:27.206158 master-0 kubenswrapper[29252]: I1203 20:12:27.206064 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Dec 03 20:12:27.220972 master-0 kubenswrapper[29252]: I1203 20:12:27.220770 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 03 20:12:27.227495 master-0 kubenswrapper[29252]: I1203 20:12:27.227416 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 03 20:12:27.238021 master-0 kubenswrapper[29252]: I1203 20:12:27.237961 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 03 20:12:27.241988 master-0 kubenswrapper[29252]: I1203 20:12:27.241586 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 03 20:12:27.327525 master-0 kubenswrapper[29252]: I1203 20:12:27.327423 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Dec 03 20:12:27.329025 master-0 kubenswrapper[29252]: I1203 20:12:27.328881 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Dec 03 20:12:27.336713 master-0 kubenswrapper[29252]: I1203 20:12:27.336639 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 03 20:12:27.403832 master-0 kubenswrapper[29252]: I1203 20:12:27.403741 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Dec 03 20:12:27.484267 master-0 kubenswrapper[29252]: I1203 20:12:27.484115 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Dec 03 20:12:27.576242 master-0 kubenswrapper[29252]: I1203 20:12:27.576119 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Dec 03 20:12:27.579045 master-0 kubenswrapper[29252]: I1203 20:12:27.578979 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Dec 03 20:12:27.590381 master-0 kubenswrapper[29252]: I1203 20:12:27.590319 29252 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 03 20:12:27.601154 master-0 kubenswrapper[29252]: I1203 20:12:27.601092 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 03 20:12:27.607713 master-0 kubenswrapper[29252]: I1203 20:12:27.607672 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Dec 03 20:12:27.607910 master-0 kubenswrapper[29252]: I1203 20:12:27.607892 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Dec 03 20:12:27.608606 master-0 kubenswrapper[29252]: I1203 20:12:27.608512 29252 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="633b751f-3f43-4c59-8114-53a2f8659571"
Dec 03 20:12:27.608687 master-0 kubenswrapper[29252]: I1203 20:12:27.608654 29252 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="633b751f-3f43-4c59-8114-53a2f8659571"
Dec 03 20:12:27.615846 master-0 kubenswrapper[29252]: I1203 20:12:27.615762 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Dec 03 20:12:27.644456 master-0 kubenswrapper[29252]: I1203 20:12:27.644043 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=23.644019331 podStartE2EDuration="23.644019331s" podCreationTimestamp="2025-12-03 20:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:12:27.639478282 +0000 UTC m=+182.453023325" watchObservedRunningTime="2025-12-03 20:12:27.644019331 +0000 UTC m=+182.457564314"
Dec 03 20:12:27.671364 master-0 kubenswrapper[29252]: I1203 20:12:27.671296 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 03 20:12:27.757709 master-0 kubenswrapper[29252]: I1203 20:12:27.757581 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 03 20:12:27.824041 master-0 kubenswrapper[29252]: I1203 20:12:27.823989 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Dec 03 20:12:27.979621 master-0 kubenswrapper[29252]: I1203 20:12:27.979530 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 03 20:12:27.987655 master-0 kubenswrapper[29252]: I1203 20:12:27.987596 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 03 20:12:28.078027 master-0 kubenswrapper[29252]: I1203 20:12:28.077892 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert"
Dec 03 20:12:28.106028 master-0 kubenswrapper[29252]: I1203 20:12:28.105977 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 03 20:12:28.117847 master-0 kubenswrapper[29252]: I1203 20:12:28.117770 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Dec 03 20:12:28.170386 master-0 kubenswrapper[29252]: I1203 20:12:28.170294 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 03 20:12:28.340739 master-0 kubenswrapper[29252]: I1203 20:12:28.340607 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 03 20:12:28.732169 master-0 kubenswrapper[29252]: I1203 20:12:28.732106 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 03 20:12:28.913571 master-0 kubenswrapper[29252]: I1203 20:12:28.913511 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Dec 03 20:12:29.024705 master-0 kubenswrapper[29252]: I1203 20:12:29.024586 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 03 20:12:29.029020 master-0 kubenswrapper[29252]: I1203 20:12:29.028963 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 03 20:12:29.080526 master-0 kubenswrapper[29252]: I1203 20:12:29.080451 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Dec 03 20:12:29.144075 master-0 kubenswrapper[29252]: I1203 20:12:29.142904 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-7tjv7"
Dec 03 20:12:29.152881 master-0 kubenswrapper[29252]: I1203 20:12:29.151888 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Dec 03 20:12:29.201629 master-0 kubenswrapper[29252]: I1203 20:12:29.201449 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 03 20:12:29.305545 master-0 kubenswrapper[29252]: I1203 20:12:29.305320 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 03 20:12:29.422129 master-0 kubenswrapper[29252]: I1203 20:12:29.422087 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 03 20:12:29.471493 master-0 kubenswrapper[29252]: I1203 20:12:29.471407 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 03 20:12:29.601298 master-0 kubenswrapper[29252]: I1203 20:12:29.601135 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 03 20:12:29.620722 master-0 kubenswrapper[29252]: I1203 20:12:29.620640 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Dec 03 20:12:30.103292 master-0 kubenswrapper[29252]: I1203 20:12:30.103193 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Dec 03 20:12:30.170842 master-0 kubenswrapper[29252]: I1203 20:12:30.170757 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 03 20:12:30.217949 master-0 kubenswrapper[29252]: I1203 20:12:30.217904 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Dec 03 20:12:30.276943 master-0 kubenswrapper[29252]: I1203 20:12:30.276880 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Dec 03 20:12:30.405706 master-0 kubenswrapper[29252]: I1203 20:12:30.405638 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 03 20:12:30.606004 master-0 kubenswrapper[29252]: I1203 20:12:30.605920 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Dec 03 20:12:30.678837 master-0 kubenswrapper[29252]: I1203 20:12:30.678699 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 03 20:12:30.708981 master-0 kubenswrapper[29252]: I1203 20:12:30.708920 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Dec 03 20:12:30.774740 master-0 kubenswrapper[29252]: I1203 20:12:30.774649 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 03 20:12:30.934077 master-0 kubenswrapper[29252]: I1203 20:12:30.933746 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Dec 03 20:12:31.230558 master-0 kubenswrapper[29252]: I1203 20:12:31.230432 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 03 20:12:31.415998 master-0 kubenswrapper[29252]: I1203 20:12:31.415927 29252 scope.go:117] "RemoveContainer" containerID="8618784cb8277c94258ed2a8dfd91f20d4b788eab3ceb47826a93998b38bfea8"
Dec 03 20:12:31.416182 master-0 kubenswrapper[29252]: E1203 20:12:31.416144 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=insights-operator pod=insights-operator-59d99f9b7b-h64kt_openshift-insights(af2023e1-9c7a-40af-a6bf-fba31c3565b1)\"" pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" podUID="af2023e1-9c7a-40af-a6bf-fba31c3565b1"
Dec 03 20:12:31.653877 master-0 kubenswrapper[29252]: I1203 20:12:31.653792 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 03 20:12:33.844554 master-0 kubenswrapper[29252]: I1203 20:12:33.844489 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Dec 03 20:12:37.975595 master-0 kubenswrapper[29252]: I1203 20:12:37.975517 29252 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Dec 03 20:12:37.976181 master-0 kubenswrapper[29252]: I1203 20:12:37.975923 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="cbad610cd8689b0972c02840bb486a62" containerName="startup-monitor" containerID="cri-o://49f0b324bc2753e4944ad48b25a1c7c849e6e6354efcd8b10464f50e49e0bdc5" gracePeriod=5
Dec 03 20:12:43.340842 master-0 kubenswrapper[29252]: I1203 20:12:43.340486 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_cbad610cd8689b0972c02840bb486a62/startup-monitor/0.log"
Dec 03 20:12:43.340842 master-0 kubenswrapper[29252]: I1203 20:12:43.340592 29252 generic.go:334] "Generic (PLEG): container finished" podID="cbad610cd8689b0972c02840bb486a62" containerID="49f0b324bc2753e4944ad48b25a1c7c849e6e6354efcd8b10464f50e49e0bdc5" exitCode=137
Dec 03 20:12:43.576634 master-0 kubenswrapper[29252]: I1203 20:12:43.576548 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_cbad610cd8689b0972c02840bb486a62/startup-monitor/0.log"
Dec 03 20:12:43.577049 master-0 kubenswrapper[29252]: I1203 20:12:43.576668 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Dec 03 20:12:43.711373 master-0 kubenswrapper[29252]: I1203 20:12:43.711310 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-pod-resource-dir\") pod \"cbad610cd8689b0972c02840bb486a62\" (UID: \"cbad610cd8689b0972c02840bb486a62\") "
Dec 03 20:12:43.711373 master-0 kubenswrapper[29252]: I1203 20:12:43.711350 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-resource-dir\") pod \"cbad610cd8689b0972c02840bb486a62\" (UID: \"cbad610cd8689b0972c02840bb486a62\") "
Dec 03 20:12:43.711373 master-0 kubenswrapper[29252]: I1203 20:12:43.711367 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-var-log\") pod \"cbad610cd8689b0972c02840bb486a62\" (UID: \"cbad610cd8689b0972c02840bb486a62\") "
Dec 03 20:12:43.711654 master-0 kubenswrapper[29252]: I1203 20:12:43.711426 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-manifests\") pod \"cbad610cd8689b0972c02840bb486a62\" (UID: \"cbad610cd8689b0972c02840bb486a62\") "
Dec 03 20:12:43.711654 master-0 kubenswrapper[29252]: I1203 20:12:43.711435 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "cbad610cd8689b0972c02840bb486a62" (UID: "cbad610cd8689b0972c02840bb486a62"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:12:43.711654 master-0 kubenswrapper[29252]: I1203 20:12:43.711449 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-var-lock\") pod \"cbad610cd8689b0972c02840bb486a62\" (UID: \"cbad610cd8689b0972c02840bb486a62\") "
Dec 03 20:12:43.711654 master-0 kubenswrapper[29252]: I1203 20:12:43.711471 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-var-lock" (OuterVolumeSpecName: "var-lock") pod "cbad610cd8689b0972c02840bb486a62" (UID: "cbad610cd8689b0972c02840bb486a62"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:12:43.711654 master-0 kubenswrapper[29252]: I1203 20:12:43.711492 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-var-log" (OuterVolumeSpecName: "var-log") pod "cbad610cd8689b0972c02840bb486a62" (UID: "cbad610cd8689b0972c02840bb486a62"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:12:43.711654 master-0 kubenswrapper[29252]: I1203 20:12:43.711540 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-manifests" (OuterVolumeSpecName: "manifests") pod "cbad610cd8689b0972c02840bb486a62" (UID: "cbad610cd8689b0972c02840bb486a62"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:12:43.711897 master-0 kubenswrapper[29252]: I1203 20:12:43.711734 29252 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-manifests\") on node \"master-0\" DevicePath \"\""
Dec 03 20:12:43.711897 master-0 kubenswrapper[29252]: I1203 20:12:43.711757 29252 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-var-lock\") on node \"master-0\" DevicePath \"\""
Dec 03 20:12:43.711897 master-0 kubenswrapper[29252]: I1203 20:12:43.711772 29252 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-resource-dir\") on node \"master-0\" DevicePath \"\""
Dec 03 20:12:43.711897 master-0 kubenswrapper[29252]: I1203 20:12:43.711825 29252 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-var-log\") on node \"master-0\" DevicePath \"\""
Dec 03 20:12:43.717168 master-0 kubenswrapper[29252]: I1203 20:12:43.717133 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "cbad610cd8689b0972c02840bb486a62" (UID: "cbad610cd8689b0972c02840bb486a62"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:12:43.813347 master-0 kubenswrapper[29252]: I1203 20:12:43.813216 29252 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/cbad610cd8689b0972c02840bb486a62-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Dec 03 20:12:44.353142 master-0 kubenswrapper[29252]: I1203 20:12:44.353047 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_cbad610cd8689b0972c02840bb486a62/startup-monitor/0.log"
Dec 03 20:12:44.354073 master-0 kubenswrapper[29252]: I1203 20:12:44.353191 29252 scope.go:117] "RemoveContainer" containerID="49f0b324bc2753e4944ad48b25a1c7c849e6e6354efcd8b10464f50e49e0bdc5"
Dec 03 20:12:44.354073 master-0 kubenswrapper[29252]: I1203 20:12:44.353298 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Dec 03 20:12:45.424324 master-0 kubenswrapper[29252]: I1203 20:12:45.424204 29252 scope.go:117] "RemoveContainer" containerID="8618784cb8277c94258ed2a8dfd91f20d4b788eab3ceb47826a93998b38bfea8"
Dec 03 20:12:45.426956 master-0 kubenswrapper[29252]: I1203 20:12:45.426887 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbad610cd8689b0972c02840bb486a62" path="/var/lib/kubelet/pods/cbad610cd8689b0972c02840bb486a62/volumes"
Dec 03 20:12:46.009675 master-0 kubenswrapper[29252]: I1203 20:12:46.009533 29252 patch_prober.go:28] interesting pod/machine-config-daemon-7t8bs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 20:12:46.009675 master-0 kubenswrapper[29252]: I1203 20:12:46.009638 29252 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 20:12:46.372565 master-0 kubenswrapper[29252]: I1203 20:12:46.372488 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59d99f9b7b-h64kt" event={"ID":"af2023e1-9c7a-40af-a6bf-fba31c3565b1","Type":"ContainerStarted","Data":"37128b417bae297116bfadf468dce659a425f2c1e829e65f75d20e53b995f218"}
Dec 03 20:13:16.009701 master-0 kubenswrapper[29252]: I1203 20:13:16.009606 29252 patch_prober.go:28] interesting pod/machine-config-daemon-7t8bs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 20:13:16.009701 master-0 kubenswrapper[29252]: I1203 20:13:16.009691 29252 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 20:13:46.008757 master-0 kubenswrapper[29252]: I1203 20:13:46.008663 29252 patch_prober.go:28] interesting pod/machine-config-daemon-7t8bs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 20:13:46.008757 master-0 kubenswrapper[29252]: I1203 20:13:46.008739 29252 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 20:13:46.010251 master-0 kubenswrapper[29252]: I1203 20:13:46.008829 29252 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs"
Dec 03 20:13:46.010251 master-0 kubenswrapper[29252]: I1203 20:13:46.009484 29252 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4056ba5f9d6e5d99d04379037a16b03e163ddc085064bce80ce77bdbefd30aec"} pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 20:13:46.010251 master-0 kubenswrapper[29252]: I1203 20:13:46.009563 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" containerID="cri-o://4056ba5f9d6e5d99d04379037a16b03e163ddc085064bce80ce77bdbefd30aec" gracePeriod=600
Dec 03 20:13:46.813962 master-0 kubenswrapper[29252]: I1203 20:13:46.813883 29252 generic.go:334] "Generic (PLEG): container finished" podID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerID="4056ba5f9d6e5d99d04379037a16b03e163ddc085064bce80ce77bdbefd30aec" exitCode=0
Dec 03 20:13:46.813962 master-0 kubenswrapper[29252]: I1203 20:13:46.813951 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" event={"ID":"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b","Type":"ContainerDied","Data":"4056ba5f9d6e5d99d04379037a16b03e163ddc085064bce80ce77bdbefd30aec"}
Dec 03 20:13:46.814303 master-0 kubenswrapper[29252]: I1203 20:13:46.813991 29252 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" event={"ID":"9891cf64-59e8-4d8d-94fe-17cfa4b18c1b","Type":"ContainerStarted","Data":"ed357f1678f18382e8bb9d0d37fd24412505e1dd6d4ba6b40528bcaa6ea31b93"} Dec 03 20:13:46.814303 master-0 kubenswrapper[29252]: I1203 20:13:46.814020 29252 scope.go:117] "RemoveContainer" containerID="07b6cf5187d73a5f60790f2be4d7efe702428be4f1b035394f75ae9cb9fd2d4a" Dec 03 20:14:20.278225 master-0 kubenswrapper[29252]: I1203 20:14:20.278168 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-74cddd4fb5-s5r44"] Dec 03 20:14:20.278715 master-0 kubenswrapper[29252]: E1203 20:14:20.278455 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb85302-c965-417f-8c35-9aff2e464281" containerName="installer" Dec 03 20:14:20.278715 master-0 kubenswrapper[29252]: I1203 20:14:20.278476 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb85302-c965-417f-8c35-9aff2e464281" containerName="installer" Dec 03 20:14:20.278715 master-0 kubenswrapper[29252]: E1203 20:14:20.278489 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbad610cd8689b0972c02840bb486a62" containerName="startup-monitor" Dec 03 20:14:20.278715 master-0 kubenswrapper[29252]: I1203 20:14:20.278501 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbad610cd8689b0972c02840bb486a62" containerName="startup-monitor" Dec 03 20:14:20.278715 master-0 kubenswrapper[29252]: I1203 20:14:20.278698 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb85302-c965-417f-8c35-9aff2e464281" containerName="installer" Dec 03 20:14:20.278909 master-0 kubenswrapper[29252]: I1203 20:14:20.278723 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbad610cd8689b0972c02840bb486a62" containerName="startup-monitor" Dec 03 20:14:20.279956 master-0 kubenswrapper[29252]: I1203 20:14:20.279927 29252 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-s5r44" Dec 03 20:14:20.293036 master-0 kubenswrapper[29252]: I1203 20:14:20.292966 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 20:14:20.296965 master-0 kubenswrapper[29252]: I1203 20:14:20.296933 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-482c4" Dec 03 20:14:20.299350 master-0 kubenswrapper[29252]: I1203 20:14:20.299301 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-74cddd4fb5-s5r44"] Dec 03 20:14:20.403204 master-0 kubenswrapper[29252]: I1203 20:14:20.403147 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8d5c43d-c3b7-4f15-bd32-fd950c172c89-proxy-tls\") pod \"machine-config-controller-74cddd4fb5-s5r44\" (UID: \"c8d5c43d-c3b7-4f15-bd32-fd950c172c89\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-s5r44" Dec 03 20:14:20.403404 master-0 kubenswrapper[29252]: I1203 20:14:20.403211 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjjxw\" (UniqueName: \"kubernetes.io/projected/c8d5c43d-c3b7-4f15-bd32-fd950c172c89-kube-api-access-vjjxw\") pod \"machine-config-controller-74cddd4fb5-s5r44\" (UID: \"c8d5c43d-c3b7-4f15-bd32-fd950c172c89\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-s5r44" Dec 03 20:14:20.403404 master-0 kubenswrapper[29252]: I1203 20:14:20.403245 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/c8d5c43d-c3b7-4f15-bd32-fd950c172c89-mcc-auth-proxy-config\") pod \"machine-config-controller-74cddd4fb5-s5r44\" (UID: \"c8d5c43d-c3b7-4f15-bd32-fd950c172c89\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-s5r44" Dec 03 20:14:20.504205 master-0 kubenswrapper[29252]: I1203 20:14:20.504124 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8d5c43d-c3b7-4f15-bd32-fd950c172c89-proxy-tls\") pod \"machine-config-controller-74cddd4fb5-s5r44\" (UID: \"c8d5c43d-c3b7-4f15-bd32-fd950c172c89\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-s5r44" Dec 03 20:14:20.504205 master-0 kubenswrapper[29252]: I1203 20:14:20.504188 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjjxw\" (UniqueName: \"kubernetes.io/projected/c8d5c43d-c3b7-4f15-bd32-fd950c172c89-kube-api-access-vjjxw\") pod \"machine-config-controller-74cddd4fb5-s5r44\" (UID: \"c8d5c43d-c3b7-4f15-bd32-fd950c172c89\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-s5r44" Dec 03 20:14:20.504548 master-0 kubenswrapper[29252]: I1203 20:14:20.504231 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8d5c43d-c3b7-4f15-bd32-fd950c172c89-mcc-auth-proxy-config\") pod \"machine-config-controller-74cddd4fb5-s5r44\" (UID: \"c8d5c43d-c3b7-4f15-bd32-fd950c172c89\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-s5r44" Dec 03 20:14:20.505195 master-0 kubenswrapper[29252]: I1203 20:14:20.505105 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8d5c43d-c3b7-4f15-bd32-fd950c172c89-mcc-auth-proxy-config\") pod \"machine-config-controller-74cddd4fb5-s5r44\" (UID: 
\"c8d5c43d-c3b7-4f15-bd32-fd950c172c89\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-s5r44" Dec 03 20:14:20.509702 master-0 kubenswrapper[29252]: I1203 20:14:20.509645 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8d5c43d-c3b7-4f15-bd32-fd950c172c89-proxy-tls\") pod \"machine-config-controller-74cddd4fb5-s5r44\" (UID: \"c8d5c43d-c3b7-4f15-bd32-fd950c172c89\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-s5r44" Dec 03 20:14:20.523209 master-0 kubenswrapper[29252]: I1203 20:14:20.523142 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjjxw\" (UniqueName: \"kubernetes.io/projected/c8d5c43d-c3b7-4f15-bd32-fd950c172c89-kube-api-access-vjjxw\") pod \"machine-config-controller-74cddd4fb5-s5r44\" (UID: \"c8d5c43d-c3b7-4f15-bd32-fd950c172c89\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-s5r44" Dec 03 20:14:20.655444 master-0 kubenswrapper[29252]: I1203 20:14:20.655379 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-s5r44" Dec 03 20:14:21.163030 master-0 kubenswrapper[29252]: I1203 20:14:21.162987 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-74cddd4fb5-s5r44"] Dec 03 20:14:21.167031 master-0 kubenswrapper[29252]: W1203 20:14:21.166985 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8d5c43d_c3b7_4f15_bd32_fd950c172c89.slice/crio-ec9db8603a6e1a5d37ae529c9c83418d84150f6e73166c920850320e653546bb WatchSource:0}: Error finding container ec9db8603a6e1a5d37ae529c9c83418d84150f6e73166c920850320e653546bb: Status 404 returned error can't find the container with id ec9db8603a6e1a5d37ae529c9c83418d84150f6e73166c920850320e653546bb Dec 03 20:14:21.444812 master-0 kubenswrapper[29252]: I1203 20:14:21.441667 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-vcqfx"] Dec 03 20:14:21.444812 master-0 kubenswrapper[29252]: I1203 20:14:21.442316 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-vcqfx" Dec 03 20:14:21.445480 master-0 kubenswrapper[29252]: I1203 20:14:21.445106 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Dec 03 20:14:21.463879 master-0 kubenswrapper[29252]: I1203 20:14:21.459975 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-54f97f57-l5rb2"] Dec 03 20:14:21.463879 master-0 kubenswrapper[29252]: I1203 20:14:21.460866 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-54f97f57-l5rb2" Dec 03 20:14:21.463879 master-0 kubenswrapper[29252]: I1203 20:14:21.463815 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm"] Dec 03 20:14:21.468642 master-0 kubenswrapper[29252]: I1203 20:14:21.464544 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm" Dec 03 20:14:21.472851 master-0 kubenswrapper[29252]: I1203 20:14:21.469324 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 20:14:21.472851 master-0 kubenswrapper[29252]: I1203 20:14:21.469520 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 20:14:21.472851 master-0 kubenswrapper[29252]: I1203 20:14:21.469620 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 20:14:21.472851 master-0 kubenswrapper[29252]: I1203 20:14:21.469725 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 20:14:21.472851 master-0 kubenswrapper[29252]: I1203 20:14:21.469841 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 20:14:21.472851 master-0 kubenswrapper[29252]: I1203 20:14:21.469950 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 20:14:21.472851 master-0 kubenswrapper[29252]: I1203 20:14:21.470061 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-6964bb78b7-jhfn7"] Dec 03 20:14:21.472851 master-0 kubenswrapper[29252]: I1203 20:14:21.470104 29252 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 20:14:21.472851 master-0 kubenswrapper[29252]: I1203 20:14:21.470763 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-6964bb78b7-jhfn7" Dec 03 20:14:21.476179 master-0 kubenswrapper[29252]: I1203 20:14:21.475996 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-vcqfx"] Dec 03 20:14:21.479203 master-0 kubenswrapper[29252]: I1203 20:14:21.478177 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm"] Dec 03 20:14:21.505346 master-0 kubenswrapper[29252]: I1203 20:14:21.505288 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-6964bb78b7-jhfn7"] Dec 03 20:14:21.510342 master-0 kubenswrapper[29252]: I1203 20:14:21.509248 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-77df56447c-vnp7f"] Dec 03 20:14:21.510342 master-0 kubenswrapper[29252]: I1203 20:14:21.509803 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-77df56447c-vnp7f" Dec 03 20:14:21.515026 master-0 kubenswrapper[29252]: I1203 20:14:21.514986 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-6kl7k" Dec 03 20:14:21.515453 master-0 kubenswrapper[29252]: I1203 20:14:21.515407 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 20:14:21.515709 master-0 kubenswrapper[29252]: I1203 20:14:21.515686 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 20:14:21.515969 master-0 kubenswrapper[29252]: I1203 20:14:21.515946 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 20:14:21.516928 master-0 kubenswrapper[29252]: I1203 20:14:21.516901 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 20:14:21.521591 master-0 kubenswrapper[29252]: I1203 20:14:21.521549 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s85v9\" (UniqueName: \"kubernetes.io/projected/a4f90ab0-b480-44a2-b87d-220ab6bba9c5-kube-api-access-s85v9\") pod \"router-default-54f97f57-l5rb2\" (UID: \"a4f90ab0-b480-44a2-b87d-220ab6bba9c5\") " pod="openshift-ingress/router-default-54f97f57-l5rb2" Dec 03 20:14:21.521591 master-0 kubenswrapper[29252]: I1203 20:14:21.521604 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4f90ab0-b480-44a2-b87d-220ab6bba9c5-metrics-certs\") pod \"router-default-54f97f57-l5rb2\" (UID: \"a4f90ab0-b480-44a2-b87d-220ab6bba9c5\") " pod="openshift-ingress/router-default-54f97f57-l5rb2" Dec 03 20:14:21.521862 master-0 
kubenswrapper[29252]: I1203 20:14:21.521628 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a4f90ab0-b480-44a2-b87d-220ab6bba9c5-default-certificate\") pod \"router-default-54f97f57-l5rb2\" (UID: \"a4f90ab0-b480-44a2-b87d-220ab6bba9c5\") " pod="openshift-ingress/router-default-54f97f57-l5rb2" Dec 03 20:14:21.521862 master-0 kubenswrapper[29252]: I1203 20:14:21.521657 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4f90ab0-b480-44a2-b87d-220ab6bba9c5-service-ca-bundle\") pod \"router-default-54f97f57-l5rb2\" (UID: \"a4f90ab0-b480-44a2-b87d-220ab6bba9c5\") " pod="openshift-ingress/router-default-54f97f57-l5rb2" Dec 03 20:14:21.521862 master-0 kubenswrapper[29252]: I1203 20:14:21.521676 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a4f90ab0-b480-44a2-b87d-220ab6bba9c5-stats-auth\") pod \"router-default-54f97f57-l5rb2\" (UID: \"a4f90ab0-b480-44a2-b87d-220ab6bba9c5\") " pod="openshift-ingress/router-default-54f97f57-l5rb2" Dec 03 20:14:21.521862 master-0 kubenswrapper[29252]: I1203 20:14:21.521692 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8d86deda-4cd7-4ed5-a703-31f644e2947d-tls-certificates\") pod \"prometheus-operator-admission-webhook-6d4cbfb4b-vcqfx\" (UID: \"8d86deda-4cd7-4ed5-a703-31f644e2947d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-vcqfx" Dec 03 20:14:21.523376 master-0 kubenswrapper[29252]: I1203 20:14:21.523357 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 20:14:21.525679 master-0 kubenswrapper[29252]: I1203 
20:14:21.525634 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-77df56447c-vnp7f"] Dec 03 20:14:21.584826 master-0 kubenswrapper[29252]: I1203 20:14:21.584757 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rj2f6"] Dec 03 20:14:21.585800 master-0 kubenswrapper[29252]: I1203 20:14:21.585480 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rj2f6" Dec 03 20:14:21.587900 master-0 kubenswrapper[29252]: I1203 20:14:21.587690 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-5f72m" Dec 03 20:14:21.588129 master-0 kubenswrapper[29252]: I1203 20:14:21.587909 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 20:14:21.588129 master-0 kubenswrapper[29252]: I1203 20:14:21.588116 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 20:14:21.588222 master-0 kubenswrapper[29252]: I1203 20:14:21.588157 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 20:14:21.596800 master-0 kubenswrapper[29252]: I1203 20:14:21.596247 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rj2f6"] Dec 03 20:14:21.622534 master-0 kubenswrapper[29252]: I1203 20:14:21.622484 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr6zr\" (UniqueName: \"kubernetes.io/projected/06a9f48d-a9e4-4949-a094-11b5a1d07c2e-kube-api-access-kr6zr\") pod \"collect-profiles-29413200-7vcfm\" (UID: \"06a9f48d-a9e4-4949-a094-11b5a1d07c2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm" Dec 03 20:14:21.622534 master-0 
kubenswrapper[29252]: I1203 20:14:21.622528 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4f90ab0-b480-44a2-b87d-220ab6bba9c5-metrics-certs\") pod \"router-default-54f97f57-l5rb2\" (UID: \"a4f90ab0-b480-44a2-b87d-220ab6bba9c5\") " pod="openshift-ingress/router-default-54f97f57-l5rb2" Dec 03 20:14:21.622772 master-0 kubenswrapper[29252]: I1203 20:14:21.622554 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/655df528-d475-4908-ba38-a5a646744484-config\") pod \"console-operator-77df56447c-vnp7f\" (UID: \"655df528-d475-4908-ba38-a5a646744484\") " pod="openshift-console-operator/console-operator-77df56447c-vnp7f" Dec 03 20:14:21.622772 master-0 kubenswrapper[29252]: I1203 20:14:21.622574 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a4f90ab0-b480-44a2-b87d-220ab6bba9c5-default-certificate\") pod \"router-default-54f97f57-l5rb2\" (UID: \"a4f90ab0-b480-44a2-b87d-220ab6bba9c5\") " pod="openshift-ingress/router-default-54f97f57-l5rb2" Dec 03 20:14:21.622772 master-0 kubenswrapper[29252]: I1203 20:14:21.622593 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cb56\" (UniqueName: \"kubernetes.io/projected/1d1224ac-cc8d-4c7e-ad30-971f698221b7-kube-api-access-4cb56\") pod \"network-check-source-6964bb78b7-jhfn7\" (UID: \"1d1224ac-cc8d-4c7e-ad30-971f698221b7\") " pod="openshift-network-diagnostics/network-check-source-6964bb78b7-jhfn7" Dec 03 20:14:21.622772 master-0 kubenswrapper[29252]: I1203 20:14:21.622694 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4f90ab0-b480-44a2-b87d-220ab6bba9c5-service-ca-bundle\") pod 
\"router-default-54f97f57-l5rb2\" (UID: \"a4f90ab0-b480-44a2-b87d-220ab6bba9c5\") " pod="openshift-ingress/router-default-54f97f57-l5rb2" Dec 03 20:14:21.622772 master-0 kubenswrapper[29252]: I1203 20:14:21.622742 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/655df528-d475-4908-ba38-a5a646744484-trusted-ca\") pod \"console-operator-77df56447c-vnp7f\" (UID: \"655df528-d475-4908-ba38-a5a646744484\") " pod="openshift-console-operator/console-operator-77df56447c-vnp7f" Dec 03 20:14:21.623010 master-0 kubenswrapper[29252]: I1203 20:14:21.622796 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a4f90ab0-b480-44a2-b87d-220ab6bba9c5-stats-auth\") pod \"router-default-54f97f57-l5rb2\" (UID: \"a4f90ab0-b480-44a2-b87d-220ab6bba9c5\") " pod="openshift-ingress/router-default-54f97f57-l5rb2" Dec 03 20:14:21.623010 master-0 kubenswrapper[29252]: I1203 20:14:21.622819 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8d86deda-4cd7-4ed5-a703-31f644e2947d-tls-certificates\") pod \"prometheus-operator-admission-webhook-6d4cbfb4b-vcqfx\" (UID: \"8d86deda-4cd7-4ed5-a703-31f644e2947d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-vcqfx" Dec 03 20:14:21.623010 master-0 kubenswrapper[29252]: I1203 20:14:21.622874 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2vsp\" (UniqueName: \"kubernetes.io/projected/655df528-d475-4908-ba38-a5a646744484-kube-api-access-w2vsp\") pod \"console-operator-77df56447c-vnp7f\" (UID: \"655df528-d475-4908-ba38-a5a646744484\") " pod="openshift-console-operator/console-operator-77df56447c-vnp7f" Dec 03 20:14:21.623010 master-0 kubenswrapper[29252]: I1203 20:14:21.622946 29252 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06a9f48d-a9e4-4949-a094-11b5a1d07c2e-secret-volume\") pod \"collect-profiles-29413200-7vcfm\" (UID: \"06a9f48d-a9e4-4949-a094-11b5a1d07c2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm" Dec 03 20:14:21.623010 master-0 kubenswrapper[29252]: I1203 20:14:21.622965 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s85v9\" (UniqueName: \"kubernetes.io/projected/a4f90ab0-b480-44a2-b87d-220ab6bba9c5-kube-api-access-s85v9\") pod \"router-default-54f97f57-l5rb2\" (UID: \"a4f90ab0-b480-44a2-b87d-220ab6bba9c5\") " pod="openshift-ingress/router-default-54f97f57-l5rb2" Dec 03 20:14:21.623286 master-0 kubenswrapper[29252]: I1203 20:14:21.623016 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06a9f48d-a9e4-4949-a094-11b5a1d07c2e-config-volume\") pod \"collect-profiles-29413200-7vcfm\" (UID: \"06a9f48d-a9e4-4949-a094-11b5a1d07c2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm" Dec 03 20:14:21.623286 master-0 kubenswrapper[29252]: I1203 20:14:21.623041 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/655df528-d475-4908-ba38-a5a646744484-serving-cert\") pod \"console-operator-77df56447c-vnp7f\" (UID: \"655df528-d475-4908-ba38-a5a646744484\") " pod="openshift-console-operator/console-operator-77df56447c-vnp7f" Dec 03 20:14:21.625750 master-0 kubenswrapper[29252]: I1203 20:14:21.625536 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4f90ab0-b480-44a2-b87d-220ab6bba9c5-service-ca-bundle\") pod \"router-default-54f97f57-l5rb2\" 
(UID: \"a4f90ab0-b480-44a2-b87d-220ab6bba9c5\") " pod="openshift-ingress/router-default-54f97f57-l5rb2" Dec 03 20:14:21.626628 master-0 kubenswrapper[29252]: I1203 20:14:21.626595 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a4f90ab0-b480-44a2-b87d-220ab6bba9c5-default-certificate\") pod \"router-default-54f97f57-l5rb2\" (UID: \"a4f90ab0-b480-44a2-b87d-220ab6bba9c5\") " pod="openshift-ingress/router-default-54f97f57-l5rb2" Dec 03 20:14:21.627349 master-0 kubenswrapper[29252]: I1203 20:14:21.627161 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8d86deda-4cd7-4ed5-a703-31f644e2947d-tls-certificates\") pod \"prometheus-operator-admission-webhook-6d4cbfb4b-vcqfx\" (UID: \"8d86deda-4cd7-4ed5-a703-31f644e2947d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-vcqfx" Dec 03 20:14:21.627349 master-0 kubenswrapper[29252]: I1203 20:14:21.627179 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a4f90ab0-b480-44a2-b87d-220ab6bba9c5-stats-auth\") pod \"router-default-54f97f57-l5rb2\" (UID: \"a4f90ab0-b480-44a2-b87d-220ab6bba9c5\") " pod="openshift-ingress/router-default-54f97f57-l5rb2" Dec 03 20:14:21.627349 master-0 kubenswrapper[29252]: I1203 20:14:21.627276 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4f90ab0-b480-44a2-b87d-220ab6bba9c5-metrics-certs\") pod \"router-default-54f97f57-l5rb2\" (UID: \"a4f90ab0-b480-44a2-b87d-220ab6bba9c5\") " pod="openshift-ingress/router-default-54f97f57-l5rb2" Dec 03 20:14:21.638728 master-0 kubenswrapper[29252]: I1203 20:14:21.638686 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s85v9\" (UniqueName: 
\"kubernetes.io/projected/a4f90ab0-b480-44a2-b87d-220ab6bba9c5-kube-api-access-s85v9\") pod \"router-default-54f97f57-l5rb2\" (UID: \"a4f90ab0-b480-44a2-b87d-220ab6bba9c5\") " pod="openshift-ingress/router-default-54f97f57-l5rb2"
Dec 03 20:14:21.724242 master-0 kubenswrapper[29252]: I1203 20:14:21.724098 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8602809a-528e-4f1a-9157-c45f0da4b768-cert\") pod \"ingress-canary-rj2f6\" (UID: \"8602809a-528e-4f1a-9157-c45f0da4b768\") " pod="openshift-ingress-canary/ingress-canary-rj2f6"
Dec 03 20:14:21.724242 master-0 kubenswrapper[29252]: I1203 20:14:21.724157 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/655df528-d475-4908-ba38-a5a646744484-trusted-ca\") pod \"console-operator-77df56447c-vnp7f\" (UID: \"655df528-d475-4908-ba38-a5a646744484\") " pod="openshift-console-operator/console-operator-77df56447c-vnp7f"
Dec 03 20:14:21.724242 master-0 kubenswrapper[29252]: I1203 20:14:21.724207 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2vsp\" (UniqueName: \"kubernetes.io/projected/655df528-d475-4908-ba38-a5a646744484-kube-api-access-w2vsp\") pod \"console-operator-77df56447c-vnp7f\" (UID: \"655df528-d475-4908-ba38-a5a646744484\") " pod="openshift-console-operator/console-operator-77df56447c-vnp7f"
Dec 03 20:14:21.724242 master-0 kubenswrapper[29252]: I1203 20:14:21.724241 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06a9f48d-a9e4-4949-a094-11b5a1d07c2e-secret-volume\") pod \"collect-profiles-29413200-7vcfm\" (UID: \"06a9f48d-a9e4-4949-a094-11b5a1d07c2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm"
Dec 03 20:14:21.724627 master-0 kubenswrapper[29252]: I1203 20:14:21.724264 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06a9f48d-a9e4-4949-a094-11b5a1d07c2e-config-volume\") pod \"collect-profiles-29413200-7vcfm\" (UID: \"06a9f48d-a9e4-4949-a094-11b5a1d07c2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm"
Dec 03 20:14:21.724627 master-0 kubenswrapper[29252]: I1203 20:14:21.724283 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcqn6\" (UniqueName: \"kubernetes.io/projected/8602809a-528e-4f1a-9157-c45f0da4b768-kube-api-access-xcqn6\") pod \"ingress-canary-rj2f6\" (UID: \"8602809a-528e-4f1a-9157-c45f0da4b768\") " pod="openshift-ingress-canary/ingress-canary-rj2f6"
Dec 03 20:14:21.724627 master-0 kubenswrapper[29252]: I1203 20:14:21.724299 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/655df528-d475-4908-ba38-a5a646744484-serving-cert\") pod \"console-operator-77df56447c-vnp7f\" (UID: \"655df528-d475-4908-ba38-a5a646744484\") " pod="openshift-console-operator/console-operator-77df56447c-vnp7f"
Dec 03 20:14:21.724627 master-0 kubenswrapper[29252]: I1203 20:14:21.724321 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr6zr\" (UniqueName: \"kubernetes.io/projected/06a9f48d-a9e4-4949-a094-11b5a1d07c2e-kube-api-access-kr6zr\") pod \"collect-profiles-29413200-7vcfm\" (UID: \"06a9f48d-a9e4-4949-a094-11b5a1d07c2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm"
Dec 03 20:14:21.724627 master-0 kubenswrapper[29252]: I1203 20:14:21.724343 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/655df528-d475-4908-ba38-a5a646744484-config\") pod \"console-operator-77df56447c-vnp7f\" (UID: \"655df528-d475-4908-ba38-a5a646744484\") " pod="openshift-console-operator/console-operator-77df56447c-vnp7f"
Dec 03 20:14:21.724627 master-0 kubenswrapper[29252]: I1203 20:14:21.724362 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cb56\" (UniqueName: \"kubernetes.io/projected/1d1224ac-cc8d-4c7e-ad30-971f698221b7-kube-api-access-4cb56\") pod \"network-check-source-6964bb78b7-jhfn7\" (UID: \"1d1224ac-cc8d-4c7e-ad30-971f698221b7\") " pod="openshift-network-diagnostics/network-check-source-6964bb78b7-jhfn7"
Dec 03 20:14:21.725371 master-0 kubenswrapper[29252]: I1203 20:14:21.725313 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/655df528-d475-4908-ba38-a5a646744484-config\") pod \"console-operator-77df56447c-vnp7f\" (UID: \"655df528-d475-4908-ba38-a5a646744484\") " pod="openshift-console-operator/console-operator-77df56447c-vnp7f"
Dec 03 20:14:21.725696 master-0 kubenswrapper[29252]: I1203 20:14:21.725652 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/655df528-d475-4908-ba38-a5a646744484-trusted-ca\") pod \"console-operator-77df56447c-vnp7f\" (UID: \"655df528-d475-4908-ba38-a5a646744484\") " pod="openshift-console-operator/console-operator-77df56447c-vnp7f"
Dec 03 20:14:21.725825 master-0 kubenswrapper[29252]: I1203 20:14:21.725709 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06a9f48d-a9e4-4949-a094-11b5a1d07c2e-config-volume\") pod \"collect-profiles-29413200-7vcfm\" (UID: \"06a9f48d-a9e4-4949-a094-11b5a1d07c2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm"
Dec 03 20:14:21.727896 master-0 kubenswrapper[29252]: I1203 20:14:21.727483 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/655df528-d475-4908-ba38-a5a646744484-serving-cert\") pod \"console-operator-77df56447c-vnp7f\" (UID: \"655df528-d475-4908-ba38-a5a646744484\") " pod="openshift-console-operator/console-operator-77df56447c-vnp7f"
Dec 03 20:14:21.731198 master-0 kubenswrapper[29252]: I1203 20:14:21.728961 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06a9f48d-a9e4-4949-a094-11b5a1d07c2e-secret-volume\") pod \"collect-profiles-29413200-7vcfm\" (UID: \"06a9f48d-a9e4-4949-a094-11b5a1d07c2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm"
Dec 03 20:14:21.743065 master-0 kubenswrapper[29252]: I1203 20:14:21.743023 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2vsp\" (UniqueName: \"kubernetes.io/projected/655df528-d475-4908-ba38-a5a646744484-kube-api-access-w2vsp\") pod \"console-operator-77df56447c-vnp7f\" (UID: \"655df528-d475-4908-ba38-a5a646744484\") " pod="openshift-console-operator/console-operator-77df56447c-vnp7f"
Dec 03 20:14:21.744617 master-0 kubenswrapper[29252]: I1203 20:14:21.744585 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr6zr\" (UniqueName: \"kubernetes.io/projected/06a9f48d-a9e4-4949-a094-11b5a1d07c2e-kube-api-access-kr6zr\") pod \"collect-profiles-29413200-7vcfm\" (UID: \"06a9f48d-a9e4-4949-a094-11b5a1d07c2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm"
Dec 03 20:14:21.745159 master-0 kubenswrapper[29252]: I1203 20:14:21.745127 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cb56\" (UniqueName: \"kubernetes.io/projected/1d1224ac-cc8d-4c7e-ad30-971f698221b7-kube-api-access-4cb56\") pod \"network-check-source-6964bb78b7-jhfn7\" (UID: \"1d1224ac-cc8d-4c7e-ad30-971f698221b7\") " pod="openshift-network-diagnostics/network-check-source-6964bb78b7-jhfn7"
Dec 03 20:14:21.776415 master-0 kubenswrapper[29252]: I1203 20:14:21.776369 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-vcqfx"
Dec 03 20:14:21.808612 master-0 kubenswrapper[29252]: I1203 20:14:21.808546 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-54f97f57-l5rb2"
Dec 03 20:14:21.826184 master-0 kubenswrapper[29252]: I1203 20:14:21.826142 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8602809a-528e-4f1a-9157-c45f0da4b768-cert\") pod \"ingress-canary-rj2f6\" (UID: \"8602809a-528e-4f1a-9157-c45f0da4b768\") " pod="openshift-ingress-canary/ingress-canary-rj2f6"
Dec 03 20:14:21.826322 master-0 kubenswrapper[29252]: I1203 20:14:21.826220 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcqn6\" (UniqueName: \"kubernetes.io/projected/8602809a-528e-4f1a-9157-c45f0da4b768-kube-api-access-xcqn6\") pod \"ingress-canary-rj2f6\" (UID: \"8602809a-528e-4f1a-9157-c45f0da4b768\") " pod="openshift-ingress-canary/ingress-canary-rj2f6"
Dec 03 20:14:21.833149 master-0 kubenswrapper[29252]: I1203 20:14:21.833100 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8602809a-528e-4f1a-9157-c45f0da4b768-cert\") pod \"ingress-canary-rj2f6\" (UID: \"8602809a-528e-4f1a-9157-c45f0da4b768\") " pod="openshift-ingress-canary/ingress-canary-rj2f6"
Dec 03 20:14:21.836614 master-0 kubenswrapper[29252]: I1203 20:14:21.836581 29252 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 03 20:14:21.843194 master-0 kubenswrapper[29252]: I1203 20:14:21.843017 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcqn6\" (UniqueName: \"kubernetes.io/projected/8602809a-528e-4f1a-9157-c45f0da4b768-kube-api-access-xcqn6\") pod \"ingress-canary-rj2f6\" (UID: \"8602809a-528e-4f1a-9157-c45f0da4b768\") " pod="openshift-ingress-canary/ingress-canary-rj2f6"
Dec 03 20:14:21.849921 master-0 kubenswrapper[29252]: I1203 20:14:21.849858 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm"
Dec 03 20:14:21.877438 master-0 kubenswrapper[29252]: I1203 20:14:21.875440 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-6964bb78b7-jhfn7"
Dec 03 20:14:21.891304 master-0 kubenswrapper[29252]: I1203 20:14:21.890841 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-77df56447c-vnp7f"
Dec 03 20:14:21.913564 master-0 kubenswrapper[29252]: I1203 20:14:21.913191 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rj2f6"
Dec 03 20:14:22.079965 master-0 kubenswrapper[29252]: I1203 20:14:22.079909 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-s5r44" event={"ID":"c8d5c43d-c3b7-4f15-bd32-fd950c172c89","Type":"ContainerStarted","Data":"6c5c4a84f727cc733e8ae8bd34df01a6d86e566bfb1f88c112255508be6b6e1c"}
Dec 03 20:14:22.079965 master-0 kubenswrapper[29252]: I1203 20:14:22.079962 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-s5r44" event={"ID":"c8d5c43d-c3b7-4f15-bd32-fd950c172c89","Type":"ContainerStarted","Data":"7526a8fb28cd9eea68811d742b66c7cf5881cb32016294380da119ff0947944c"}
Dec 03 20:14:22.079965 master-0 kubenswrapper[29252]: I1203 20:14:22.079976 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-s5r44" event={"ID":"c8d5c43d-c3b7-4f15-bd32-fd950c172c89","Type":"ContainerStarted","Data":"ec9db8603a6e1a5d37ae529c9c83418d84150f6e73166c920850320e653546bb"}
Dec 03 20:14:22.080933 master-0 kubenswrapper[29252]: I1203 20:14:22.080901 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54f97f57-l5rb2" event={"ID":"a4f90ab0-b480-44a2-b87d-220ab6bba9c5","Type":"ContainerStarted","Data":"e215d2a49fe6ff63c8d42b83b04590a22faea1d0a7df659b7a79e8341ea134c8"}
Dec 03 20:14:22.101191 master-0 kubenswrapper[29252]: I1203 20:14:22.099924 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-s5r44" podStartSLOduration=2.099903407 podStartE2EDuration="2.099903407s" podCreationTimestamp="2025-12-03 20:14:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:14:22.099017366 +0000 UTC m=+296.912562359" watchObservedRunningTime="2025-12-03 20:14:22.099903407 +0000 UTC m=+296.913448370"
Dec 03 20:14:22.176848 master-0 kubenswrapper[29252]: W1203 20:14:22.176801 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d86deda_4cd7_4ed5_a703_31f644e2947d.slice/crio-81239f7451d93ffaafea063437497ee52400df3ab27e2593e75613ef76448af1 WatchSource:0}: Error finding container 81239f7451d93ffaafea063437497ee52400df3ab27e2593e75613ef76448af1: Status 404 returned error can't find the container with id 81239f7451d93ffaafea063437497ee52400df3ab27e2593e75613ef76448af1
Dec 03 20:14:22.178882 master-0 kubenswrapper[29252]: I1203 20:14:22.178835 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-vcqfx"]
Dec 03 20:14:22.322588 master-0 kubenswrapper[29252]: I1203 20:14:22.322546 29252 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 20:14:22.366934 master-0 kubenswrapper[29252]: I1203 20:14:22.364484 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm"]
Dec 03 20:14:22.375374 master-0 kubenswrapper[29252]: W1203 20:14:22.374137 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06a9f48d_a9e4_4949_a094_11b5a1d07c2e.slice/crio-c9033e6d500d3f9e1e2002f2da7d1782569d9d4bec34a41b58c3c631d43c320f WatchSource:0}: Error finding container c9033e6d500d3f9e1e2002f2da7d1782569d9d4bec34a41b58c3c631d43c320f: Status 404 returned error can't find the container with id c9033e6d500d3f9e1e2002f2da7d1782569d9d4bec34a41b58c3c631d43c320f
Dec 03 20:14:22.396323 master-0 kubenswrapper[29252]: I1203 20:14:22.396123 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-6964bb78b7-jhfn7"]
Dec 03 20:14:22.403258 master-0 kubenswrapper[29252]: W1203 20:14:22.403207 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d1224ac_cc8d_4c7e_ad30_971f698221b7.slice/crio-0780333dee799ef5330775d97a1a1cbbc7641261afb5a1122239b278f7519faf WatchSource:0}: Error finding container 0780333dee799ef5330775d97a1a1cbbc7641261afb5a1122239b278f7519faf: Status 404 returned error can't find the container with id 0780333dee799ef5330775d97a1a1cbbc7641261afb5a1122239b278f7519faf
Dec 03 20:14:22.453800 master-0 kubenswrapper[29252]: I1203 20:14:22.449431 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rj2f6"]
Dec 03 20:14:22.460636 master-0 kubenswrapper[29252]: I1203 20:14:22.460567 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-77df56447c-vnp7f"]
Dec 03 20:14:23.087729 master-0 kubenswrapper[29252]: I1203 20:14:23.087664 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-vcqfx" event={"ID":"8d86deda-4cd7-4ed5-a703-31f644e2947d","Type":"ContainerStarted","Data":"81239f7451d93ffaafea063437497ee52400df3ab27e2593e75613ef76448af1"}
Dec 03 20:14:23.089582 master-0 kubenswrapper[29252]: I1203 20:14:23.089554 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rj2f6" event={"ID":"8602809a-528e-4f1a-9157-c45f0da4b768","Type":"ContainerStarted","Data":"06f92c544c711b5a52f571a8208aba61f74f28abef7836fee955dff4c810df03"}
Dec 03 20:14:23.089582 master-0 kubenswrapper[29252]: I1203 20:14:23.089582 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rj2f6" event={"ID":"8602809a-528e-4f1a-9157-c45f0da4b768","Type":"ContainerStarted","Data":"8da3b853b6e8b10bf1bd8f93e3ed3dc56dec92a6ce515afa2ee5966d58d20729"}
Dec 03 20:14:23.091682 master-0 kubenswrapper[29252]: I1203 20:14:23.091654 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm" event={"ID":"06a9f48d-a9e4-4949-a094-11b5a1d07c2e","Type":"ContainerStarted","Data":"c8229ff974ee812b63430d5fb83bc853c30a452802712e04b5aeb74a776fc9f9"}
Dec 03 20:14:23.091796 master-0 kubenswrapper[29252]: I1203 20:14:23.091685 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm" event={"ID":"06a9f48d-a9e4-4949-a094-11b5a1d07c2e","Type":"ContainerStarted","Data":"c9033e6d500d3f9e1e2002f2da7d1782569d9d4bec34a41b58c3c631d43c320f"}
Dec 03 20:14:23.093060 master-0 kubenswrapper[29252]: I1203 20:14:23.093021 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-77df56447c-vnp7f" event={"ID":"655df528-d475-4908-ba38-a5a646744484","Type":"ContainerStarted","Data":"5c2bb05e497d1e30e6c910fd2caa84d39b39dc2abfd035a8e63eee12201f3128"}
Dec 03 20:14:23.095368 master-0 kubenswrapper[29252]: I1203 20:14:23.095344 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-6964bb78b7-jhfn7" event={"ID":"1d1224ac-cc8d-4c7e-ad30-971f698221b7","Type":"ContainerStarted","Data":"69226650af379182ef80036b4ab381f3dea7f4c00b8ed499503c125136edd629"}
Dec 03 20:14:23.095368 master-0 kubenswrapper[29252]: I1203 20:14:23.095368 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-6964bb78b7-jhfn7" event={"ID":"1d1224ac-cc8d-4c7e-ad30-971f698221b7","Type":"ContainerStarted","Data":"0780333dee799ef5330775d97a1a1cbbc7641261afb5a1122239b278f7519faf"}
Dec 03 20:14:23.114809 master-0 kubenswrapper[29252]: I1203 20:14:23.113791 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rj2f6" podStartSLOduration=2.113760486 podStartE2EDuration="2.113760486s" podCreationTimestamp="2025-12-03 20:14:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:14:23.10566256 +0000 UTC m=+297.919207523" watchObservedRunningTime="2025-12-03 20:14:23.113760486 +0000 UTC m=+297.927305439"
Dec 03 20:14:23.149804 master-0 kubenswrapper[29252]: I1203 20:14:23.149533 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm" podStartSLOduration=393.149514343 podStartE2EDuration="6m33.149514343s" podCreationTimestamp="2025-12-03 20:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:14:23.131323792 +0000 UTC m=+297.944868755" watchObservedRunningTime="2025-12-03 20:14:23.149514343 +0000 UTC m=+297.963059296"
Dec 03 20:14:23.153010 master-0 kubenswrapper[29252]: I1203 20:14:23.151249 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-6964bb78b7-jhfn7" podStartSLOduration=1201.151240254 podStartE2EDuration="20m1.151240254s" podCreationTimestamp="2025-12-03 19:54:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:14:23.147232667 +0000 UTC m=+297.960777630" watchObservedRunningTime="2025-12-03 20:14:23.151240254 +0000 UTC m=+297.964785207"
Dec 03 20:14:24.102341 master-0 kubenswrapper[29252]: I1203 20:14:24.102291 29252 generic.go:334] "Generic (PLEG): container finished" podID="06a9f48d-a9e4-4949-a094-11b5a1d07c2e" containerID="c8229ff974ee812b63430d5fb83bc853c30a452802712e04b5aeb74a776fc9f9" exitCode=0
Dec 03 20:14:24.102912 master-0 kubenswrapper[29252]: I1203 20:14:24.102474 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm" event={"ID":"06a9f48d-a9e4-4949-a094-11b5a1d07c2e","Type":"ContainerDied","Data":"c8229ff974ee812b63430d5fb83bc853c30a452802712e04b5aeb74a776fc9f9"}
Dec 03 20:14:24.735537 master-0 kubenswrapper[29252]: I1203 20:14:24.735474 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-hzx8j"]
Dec 03 20:14:24.736236 master-0 kubenswrapper[29252]: I1203 20:14:24.736197 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hzx8j"
Dec 03 20:14:24.738884 master-0 kubenswrapper[29252]: I1203 20:14:24.737673 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-7m7l6"
Dec 03 20:14:24.738884 master-0 kubenswrapper[29252]: I1203 20:14:24.738115 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 03 20:14:24.738884 master-0 kubenswrapper[29252]: I1203 20:14:24.738701 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 03 20:14:24.777442 master-0 kubenswrapper[29252]: I1203 20:14:24.777279 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e394bb77-fda1-491c-b4c1-c4a2a0f35a32-certs\") pod \"machine-config-server-hzx8j\" (UID: \"e394bb77-fda1-491c-b4c1-c4a2a0f35a32\") " pod="openshift-machine-config-operator/machine-config-server-hzx8j"
Dec 03 20:14:24.777442 master-0 kubenswrapper[29252]: I1203 20:14:24.777353 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7vbz\" (UniqueName: \"kubernetes.io/projected/e394bb77-fda1-491c-b4c1-c4a2a0f35a32-kube-api-access-r7vbz\") pod \"machine-config-server-hzx8j\" (UID: \"e394bb77-fda1-491c-b4c1-c4a2a0f35a32\") " pod="openshift-machine-config-operator/machine-config-server-hzx8j"
Dec 03 20:14:24.777442 master-0 kubenswrapper[29252]: I1203 20:14:24.777409 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e394bb77-fda1-491c-b4c1-c4a2a0f35a32-node-bootstrap-token\") pod \"machine-config-server-hzx8j\" (UID: \"e394bb77-fda1-491c-b4c1-c4a2a0f35a32\") " pod="openshift-machine-config-operator/machine-config-server-hzx8j"
Dec 03 20:14:24.878475 master-0 kubenswrapper[29252]: I1203 20:14:24.878430 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e394bb77-fda1-491c-b4c1-c4a2a0f35a32-node-bootstrap-token\") pod \"machine-config-server-hzx8j\" (UID: \"e394bb77-fda1-491c-b4c1-c4a2a0f35a32\") " pod="openshift-machine-config-operator/machine-config-server-hzx8j"
Dec 03 20:14:24.878603 master-0 kubenswrapper[29252]: I1203 20:14:24.878527 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e394bb77-fda1-491c-b4c1-c4a2a0f35a32-certs\") pod \"machine-config-server-hzx8j\" (UID: \"e394bb77-fda1-491c-b4c1-c4a2a0f35a32\") " pod="openshift-machine-config-operator/machine-config-server-hzx8j"
Dec 03 20:14:24.878603 master-0 kubenswrapper[29252]: I1203 20:14:24.878570 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7vbz\" (UniqueName: \"kubernetes.io/projected/e394bb77-fda1-491c-b4c1-c4a2a0f35a32-kube-api-access-r7vbz\") pod \"machine-config-server-hzx8j\" (UID: \"e394bb77-fda1-491c-b4c1-c4a2a0f35a32\") " pod="openshift-machine-config-operator/machine-config-server-hzx8j"
Dec 03 20:14:24.882144 master-0 kubenswrapper[29252]: I1203 20:14:24.882112 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e394bb77-fda1-491c-b4c1-c4a2a0f35a32-certs\") pod \"machine-config-server-hzx8j\" (UID: \"e394bb77-fda1-491c-b4c1-c4a2a0f35a32\") " pod="openshift-machine-config-operator/machine-config-server-hzx8j"
Dec 03 20:14:24.883625 master-0 kubenswrapper[29252]: I1203 20:14:24.883592 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e394bb77-fda1-491c-b4c1-c4a2a0f35a32-node-bootstrap-token\") pod \"machine-config-server-hzx8j\" (UID: \"e394bb77-fda1-491c-b4c1-c4a2a0f35a32\") " pod="openshift-machine-config-operator/machine-config-server-hzx8j"
Dec 03 20:14:24.896508 master-0 kubenswrapper[29252]: I1203 20:14:24.896466 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7vbz\" (UniqueName: \"kubernetes.io/projected/e394bb77-fda1-491c-b4c1-c4a2a0f35a32-kube-api-access-r7vbz\") pod \"machine-config-server-hzx8j\" (UID: \"e394bb77-fda1-491c-b4c1-c4a2a0f35a32\") " pod="openshift-machine-config-operator/machine-config-server-hzx8j"
Dec 03 20:14:25.057632 master-0 kubenswrapper[29252]: I1203 20:14:25.057556 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hzx8j"
Dec 03 20:14:25.084070 master-0 kubenswrapper[29252]: W1203 20:14:25.083981 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode394bb77_fda1_491c_b4c1_c4a2a0f35a32.slice/crio-f6c7b726d0e514835610c4502a44edf156b6c9ec67efa7abcdaa9acfc6cd7157 WatchSource:0}: Error finding container f6c7b726d0e514835610c4502a44edf156b6c9ec67efa7abcdaa9acfc6cd7157: Status 404 returned error can't find the container with id f6c7b726d0e514835610c4502a44edf156b6c9ec67efa7abcdaa9acfc6cd7157
Dec 03 20:14:25.120596 master-0 kubenswrapper[29252]: I1203 20:14:25.120548 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-vcqfx" event={"ID":"8d86deda-4cd7-4ed5-a703-31f644e2947d","Type":"ContainerStarted","Data":"8a4bae9595b0d5375bd258ba574cd4734851d8c98b5c87398366ce0513aaacaa"}
Dec 03 20:14:25.121101 master-0 kubenswrapper[29252]: I1203 20:14:25.120887 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-vcqfx"
Dec 03 20:14:25.124633 master-0 kubenswrapper[29252]: I1203 20:14:25.123264 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54f97f57-l5rb2" event={"ID":"a4f90ab0-b480-44a2-b87d-220ab6bba9c5","Type":"ContainerStarted","Data":"801a0e208220ad89f9915c9b38a69c860efb3a7e438c4b152a85dd4520cc9a46"}
Dec 03 20:14:25.126412 master-0 kubenswrapper[29252]: I1203 20:14:25.125773 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hzx8j" event={"ID":"e394bb77-fda1-491c-b4c1-c4a2a0f35a32","Type":"ContainerStarted","Data":"f6c7b726d0e514835610c4502a44edf156b6c9ec67efa7abcdaa9acfc6cd7157"}
Dec 03 20:14:25.126412 master-0 kubenswrapper[29252]: I1203 20:14:25.126090 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-vcqfx"
Dec 03 20:14:25.127771 master-0 kubenswrapper[29252]: I1203 20:14:25.127708 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-77df56447c-vnp7f" event={"ID":"655df528-d475-4908-ba38-a5a646744484","Type":"ContainerStarted","Data":"a6a79024372d3e1adcb6b16ef57338532a004cacb4ed2fdeff80eaae9d9413e5"}
Dec 03 20:14:25.128020 master-0 kubenswrapper[29252]: I1203 20:14:25.127958 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-77df56447c-vnp7f"
Dec 03 20:14:25.129580 master-0 kubenswrapper[29252]: I1203 20:14:25.129292 29252 patch_prober.go:28] interesting pod/console-operator-77df56447c-vnp7f container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.128.0.86:8443/readyz\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Dec 03 20:14:25.129580 master-0 kubenswrapper[29252]: I1203 20:14:25.129335 29252 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-77df56447c-vnp7f" podUID="655df528-d475-4908-ba38-a5a646744484" containerName="console-operator" probeResult="failure" output="Get \"https://10.128.0.86:8443/readyz\": dial tcp 10.128.0.86:8443: connect: connection refused"
Dec 03 20:14:25.155754 master-0 kubenswrapper[29252]: I1203 20:14:25.155664 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-vcqfx" podStartSLOduration=1106.522144718 podStartE2EDuration="18m29.155641006s" podCreationTimestamp="2025-12-03 19:55:56 +0000 UTC" firstStartedPulling="2025-12-03 20:14:22.179541969 +0000 UTC m=+296.993086922" lastFinishedPulling="2025-12-03 20:14:24.813038257 +0000 UTC m=+299.626583210" observedRunningTime="2025-12-03 20:14:25.153886164 +0000 UTC m=+299.967431137" watchObservedRunningTime="2025-12-03 20:14:25.155641006 +0000 UTC m=+299.969185989"
Dec 03 20:14:25.179266 master-0 kubenswrapper[29252]: I1203 20:14:25.179200 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-77df56447c-vnp7f" podStartSLOduration=1.797622938 podStartE2EDuration="4.179180287s" podCreationTimestamp="2025-12-03 20:14:21 +0000 UTC" firstStartedPulling="2025-12-03 20:14:22.461963598 +0000 UTC m=+297.275508551" lastFinishedPulling="2025-12-03 20:14:24.843520957 +0000 UTC m=+299.657065900" observedRunningTime="2025-12-03 20:14:25.175120689 +0000 UTC m=+299.988665642" watchObservedRunningTime="2025-12-03 20:14:25.179180287 +0000 UTC m=+299.992725230"
Dec 03 20:14:25.525476 master-0 kubenswrapper[29252]: I1203 20:14:25.525347 29252 kubelet.go:1505] "Image garbage collection succeeded"
Dec 03 20:14:25.551580 master-0 kubenswrapper[29252]: I1203 20:14:25.551519 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm"
Dec 03 20:14:25.576198 master-0 kubenswrapper[29252]: I1203 20:14:25.576075 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-54f97f57-l5rb2" podStartSLOduration=1111.597275399 podStartE2EDuration="18m34.576050232s" podCreationTimestamp="2025-12-03 19:55:51 +0000 UTC" firstStartedPulling="2025-12-03 20:14:21.83654899 +0000 UTC m=+296.650093943" lastFinishedPulling="2025-12-03 20:14:24.815323823 +0000 UTC m=+299.628868776" observedRunningTime="2025-12-03 20:14:25.24072935 +0000 UTC m=+300.054274313" watchObservedRunningTime="2025-12-03 20:14:25.576050232 +0000 UTC m=+300.389595185"
Dec 03 20:14:25.621869 master-0 kubenswrapper[29252]: I1203 20:14:25.619199 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06a9f48d-a9e4-4949-a094-11b5a1d07c2e-secret-volume\") pod \"06a9f48d-a9e4-4949-a094-11b5a1d07c2e\" (UID: \"06a9f48d-a9e4-4949-a094-11b5a1d07c2e\") "
Dec 03 20:14:25.621869 master-0 kubenswrapper[29252]: I1203 20:14:25.619280 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06a9f48d-a9e4-4949-a094-11b5a1d07c2e-config-volume\") pod \"06a9f48d-a9e4-4949-a094-11b5a1d07c2e\" (UID: \"06a9f48d-a9e4-4949-a094-11b5a1d07c2e\") "
Dec 03 20:14:25.621869 master-0 kubenswrapper[29252]: I1203 20:14:25.619315 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr6zr\" (UniqueName: \"kubernetes.io/projected/06a9f48d-a9e4-4949-a094-11b5a1d07c2e-kube-api-access-kr6zr\") pod \"06a9f48d-a9e4-4949-a094-11b5a1d07c2e\" (UID: \"06a9f48d-a9e4-4949-a094-11b5a1d07c2e\") "
Dec 03 20:14:25.631794 master-0 kubenswrapper[29252]: I1203 20:14:25.625423 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a9f48d-a9e4-4949-a094-11b5a1d07c2e-config-volume" (OuterVolumeSpecName: "config-volume") pod "06a9f48d-a9e4-4949-a094-11b5a1d07c2e" (UID: "06a9f48d-a9e4-4949-a094-11b5a1d07c2e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 20:14:25.631794 master-0 kubenswrapper[29252]: I1203 20:14:25.626819 29252 scope.go:117] "RemoveContainer" containerID="88a426b4c066f4efd6c67dba2d50d1674139b8757075139f8541302d74a32ce6"
Dec 03 20:14:25.631794 master-0 kubenswrapper[29252]: I1203 20:14:25.627157 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06a9f48d-a9e4-4949-a094-11b5a1d07c2e-kube-api-access-kr6zr" (OuterVolumeSpecName: "kube-api-access-kr6zr") pod "06a9f48d-a9e4-4949-a094-11b5a1d07c2e" (UID: "06a9f48d-a9e4-4949-a094-11b5a1d07c2e"). InnerVolumeSpecName "kube-api-access-kr6zr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 20:14:25.636957 master-0 kubenswrapper[29252]: I1203 20:14:25.636899 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a9f48d-a9e4-4949-a094-11b5a1d07c2e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "06a9f48d-a9e4-4949-a094-11b5a1d07c2e" (UID: "06a9f48d-a9e4-4949-a094-11b5a1d07c2e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 20:14:25.721649 master-0 kubenswrapper[29252]: I1203 20:14:25.720979 29252 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06a9f48d-a9e4-4949-a094-11b5a1d07c2e-secret-volume\") on node \"master-0\" DevicePath \"\""
Dec 03 20:14:25.721649 master-0 kubenswrapper[29252]: I1203 20:14:25.721009 29252 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06a9f48d-a9e4-4949-a094-11b5a1d07c2e-config-volume\") on node \"master-0\" DevicePath \"\""
Dec 03 20:14:25.721649 master-0 kubenswrapper[29252]: I1203 20:14:25.721019 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr6zr\" (UniqueName: \"kubernetes.io/projected/06a9f48d-a9e4-4949-a094-11b5a1d07c2e-kube-api-access-kr6zr\") on node \"master-0\" DevicePath \"\""
Dec 03 20:14:25.740508 master-0 kubenswrapper[29252]: I1203 20:14:25.740459 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-565bdcb8-mrkck"]
Dec 03 20:14:25.740731 master-0 kubenswrapper[29252]: E1203 20:14:25.740669 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a9f48d-a9e4-4949-a094-11b5a1d07c2e" containerName="collect-profiles"
Dec 03 20:14:25.740731 master-0 kubenswrapper[29252]: I1203 20:14:25.740681 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a9f48d-a9e4-4949-a094-11b5a1d07c2e" containerName="collect-profiles"
Dec 03 20:14:25.740907 master-0 kubenswrapper[29252]: I1203 20:14:25.740807 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="06a9f48d-a9e4-4949-a094-11b5a1d07c2e" containerName="collect-profiles"
Dec 03 20:14:25.741456 master-0 kubenswrapper[29252]: I1203 20:14:25.741436 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck"
Dec 03 20:14:25.744716 master-0 kubenswrapper[29252]: I1203 20:14:25.744682 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-hswpp"
Dec 03 20:14:25.744922 master-0 kubenswrapper[29252]: I1203 20:14:25.744728 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Dec 03 20:14:25.744922 master-0 kubenswrapper[29252]: I1203 20:14:25.744763 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Dec 03 20:14:25.744922 master-0 kubenswrapper[29252]: I1203 20:14:25.744731 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Dec 03 20:14:25.757755 master-0 kubenswrapper[29252]: I1203 20:14:25.757715 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-565bdcb8-mrkck"]
Dec 03 20:14:25.810855 master-0 kubenswrapper[29252]: I1203 20:14:25.810058 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-54f97f57-l5rb2"
Dec 03 20:14:25.816424 master-0 kubenswrapper[29252]: I1203 20:14:25.813241 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-54f97f57-l5rb2"
Dec 03 20:14:25.821722 master-0 kubenswrapper[29252]: I1203 20:14:25.821686 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/49913de2-24ef-452c-b82a-1f613baa7438-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-565bdcb8-mrkck\" (UID: \"49913de2-24ef-452c-b82a-1f613baa7438\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck"
Dec 03
20:14:25.821926 master-0 kubenswrapper[29252]: I1203 20:14:25.821733 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6j66\" (UniqueName: \"kubernetes.io/projected/49913de2-24ef-452c-b82a-1f613baa7438-kube-api-access-w6j66\") pod \"prometheus-operator-565bdcb8-mrkck\" (UID: \"49913de2-24ef-452c-b82a-1f613baa7438\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck" Dec 03 20:14:25.821926 master-0 kubenswrapper[29252]: I1203 20:14:25.821768 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/49913de2-24ef-452c-b82a-1f613baa7438-metrics-client-ca\") pod \"prometheus-operator-565bdcb8-mrkck\" (UID: \"49913de2-24ef-452c-b82a-1f613baa7438\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck" Dec 03 20:14:25.821926 master-0 kubenswrapper[29252]: I1203 20:14:25.821804 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/49913de2-24ef-452c-b82a-1f613baa7438-prometheus-operator-tls\") pod \"prometheus-operator-565bdcb8-mrkck\" (UID: \"49913de2-24ef-452c-b82a-1f613baa7438\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck" Dec 03 20:14:25.872663 master-0 kubenswrapper[29252]: I1203 20:14:25.870928 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6f5db8559b-hd8bd"] Dec 03 20:14:25.872663 master-0 kubenswrapper[29252]: I1203 20:14:25.871835 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6f5db8559b-hd8bd" Dec 03 20:14:25.874036 master-0 kubenswrapper[29252]: I1203 20:14:25.873999 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-4hc9t" Dec 03 20:14:25.879487 master-0 kubenswrapper[29252]: I1203 20:14:25.878479 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 20:14:25.879487 master-0 kubenswrapper[29252]: I1203 20:14:25.878680 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 20:14:25.890966 master-0 kubenswrapper[29252]: I1203 20:14:25.890904 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6f5db8559b-hd8bd"] Dec 03 20:14:25.923051 master-0 kubenswrapper[29252]: I1203 20:14:25.923014 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/49913de2-24ef-452c-b82a-1f613baa7438-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-565bdcb8-mrkck\" (UID: \"49913de2-24ef-452c-b82a-1f613baa7438\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck" Dec 03 20:14:25.923287 master-0 kubenswrapper[29252]: I1203 20:14:25.923274 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6j66\" (UniqueName: \"kubernetes.io/projected/49913de2-24ef-452c-b82a-1f613baa7438-kube-api-access-w6j66\") pod \"prometheus-operator-565bdcb8-mrkck\" (UID: \"49913de2-24ef-452c-b82a-1f613baa7438\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck" Dec 03 20:14:25.923650 master-0 kubenswrapper[29252]: I1203 20:14:25.923635 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/49913de2-24ef-452c-b82a-1f613baa7438-metrics-client-ca\") pod \"prometheus-operator-565bdcb8-mrkck\" (UID: \"49913de2-24ef-452c-b82a-1f613baa7438\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck" Dec 03 20:14:25.924441 master-0 kubenswrapper[29252]: I1203 20:14:25.924427 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/49913de2-24ef-452c-b82a-1f613baa7438-prometheus-operator-tls\") pod \"prometheus-operator-565bdcb8-mrkck\" (UID: \"49913de2-24ef-452c-b82a-1f613baa7438\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck" Dec 03 20:14:25.924764 master-0 kubenswrapper[29252]: I1203 20:14:25.924747 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh6lw\" (UniqueName: \"kubernetes.io/projected/68a9a9c5-fd3f-4a9b-ba9f-6a02dbb70f64-kube-api-access-mh6lw\") pod \"downloads-6f5db8559b-hd8bd\" (UID: \"68a9a9c5-fd3f-4a9b-ba9f-6a02dbb70f64\") " pod="openshift-console/downloads-6f5db8559b-hd8bd" Dec 03 20:14:25.924905 master-0 kubenswrapper[29252]: E1203 20:14:25.924693 29252 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Dec 03 20:14:25.925001 master-0 kubenswrapper[29252]: E1203 20:14:25.924991 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49913de2-24ef-452c-b82a-1f613baa7438-prometheus-operator-tls podName:49913de2-24ef-452c-b82a-1f613baa7438 nodeName:}" failed. No retries permitted until 2025-12-03 20:14:26.424977404 +0000 UTC m=+301.238522357 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/49913de2-24ef-452c-b82a-1f613baa7438-prometheus-operator-tls") pod "prometheus-operator-565bdcb8-mrkck" (UID: "49913de2-24ef-452c-b82a-1f613baa7438") : secret "prometheus-operator-tls" not found Dec 03 20:14:25.925076 master-0 kubenswrapper[29252]: I1203 20:14:25.924379 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/49913de2-24ef-452c-b82a-1f613baa7438-metrics-client-ca\") pod \"prometheus-operator-565bdcb8-mrkck\" (UID: \"49913de2-24ef-452c-b82a-1f613baa7438\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck" Dec 03 20:14:25.926704 master-0 kubenswrapper[29252]: I1203 20:14:25.925821 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/49913de2-24ef-452c-b82a-1f613baa7438-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-565bdcb8-mrkck\" (UID: \"49913de2-24ef-452c-b82a-1f613baa7438\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck" Dec 03 20:14:25.946533 master-0 kubenswrapper[29252]: I1203 20:14:25.946483 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6j66\" (UniqueName: \"kubernetes.io/projected/49913de2-24ef-452c-b82a-1f613baa7438-kube-api-access-w6j66\") pod \"prometheus-operator-565bdcb8-mrkck\" (UID: \"49913de2-24ef-452c-b82a-1f613baa7438\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck" Dec 03 20:14:26.032458 master-0 kubenswrapper[29252]: I1203 20:14:26.032384 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh6lw\" (UniqueName: \"kubernetes.io/projected/68a9a9c5-fd3f-4a9b-ba9f-6a02dbb70f64-kube-api-access-mh6lw\") pod \"downloads-6f5db8559b-hd8bd\" (UID: \"68a9a9c5-fd3f-4a9b-ba9f-6a02dbb70f64\") " 
pod="openshift-console/downloads-6f5db8559b-hd8bd" Dec 03 20:14:26.047040 master-0 kubenswrapper[29252]: I1203 20:14:26.046975 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh6lw\" (UniqueName: \"kubernetes.io/projected/68a9a9c5-fd3f-4a9b-ba9f-6a02dbb70f64-kube-api-access-mh6lw\") pod \"downloads-6f5db8559b-hd8bd\" (UID: \"68a9a9c5-fd3f-4a9b-ba9f-6a02dbb70f64\") " pod="openshift-console/downloads-6f5db8559b-hd8bd" Dec 03 20:14:26.136795 master-0 kubenswrapper[29252]: I1203 20:14:26.136730 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm" Dec 03 20:14:26.136795 master-0 kubenswrapper[29252]: I1203 20:14:26.136737 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413200-7vcfm" event={"ID":"06a9f48d-a9e4-4949-a094-11b5a1d07c2e","Type":"ContainerDied","Data":"c9033e6d500d3f9e1e2002f2da7d1782569d9d4bec34a41b58c3c631d43c320f"} Dec 03 20:14:26.136795 master-0 kubenswrapper[29252]: I1203 20:14:26.136802 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9033e6d500d3f9e1e2002f2da7d1782569d9d4bec34a41b58c3c631d43c320f" Dec 03 20:14:26.138119 master-0 kubenswrapper[29252]: I1203 20:14:26.138058 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hzx8j" event={"ID":"e394bb77-fda1-491c-b4c1-c4a2a0f35a32","Type":"ContainerStarted","Data":"c1845e519ccdcdc597be55abb90317bd6bb6847d92dd88e059f7580b6e5e7d52"} Dec 03 20:14:26.139338 master-0 kubenswrapper[29252]: I1203 20:14:26.138803 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-54f97f57-l5rb2" Dec 03 20:14:26.140277 master-0 kubenswrapper[29252]: I1203 20:14:26.140253 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ingress/router-default-54f97f57-l5rb2" Dec 03 20:14:26.144650 master-0 kubenswrapper[29252]: I1203 20:14:26.144613 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-77df56447c-vnp7f" Dec 03 20:14:26.164293 master-0 kubenswrapper[29252]: I1203 20:14:26.164200 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-hzx8j" podStartSLOduration=2.164178735 podStartE2EDuration="2.164178735s" podCreationTimestamp="2025-12-03 20:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:14:26.159924212 +0000 UTC m=+300.973469165" watchObservedRunningTime="2025-12-03 20:14:26.164178735 +0000 UTC m=+300.977723688" Dec 03 20:14:26.204274 master-0 kubenswrapper[29252]: I1203 20:14:26.204174 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6f5db8559b-hd8bd" Dec 03 20:14:26.337687 master-0 kubenswrapper[29252]: I1203 20:14:26.337644 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-cdvg6"] Dec 03 20:14:26.340000 master-0 kubenswrapper[29252]: I1203 20:14:26.339961 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-cdvg6" Dec 03 20:14:26.341878 master-0 kubenswrapper[29252]: I1203 20:14:26.341734 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-mx9dn" Dec 03 20:14:26.341878 master-0 kubenswrapper[29252]: I1203 20:14:26.341735 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 20:14:26.443404 master-0 kubenswrapper[29252]: I1203 20:14:26.443290 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp28p\" (UniqueName: \"kubernetes.io/projected/82b6f6a1-aac8-4293-bdf9-8e85ca6d5898-kube-api-access-pp28p\") pod \"node-ca-cdvg6\" (UID: \"82b6f6a1-aac8-4293-bdf9-8e85ca6d5898\") " pod="openshift-image-registry/node-ca-cdvg6" Dec 03 20:14:26.443794 master-0 kubenswrapper[29252]: I1203 20:14:26.443756 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82b6f6a1-aac8-4293-bdf9-8e85ca6d5898-host\") pod \"node-ca-cdvg6\" (UID: \"82b6f6a1-aac8-4293-bdf9-8e85ca6d5898\") " pod="openshift-image-registry/node-ca-cdvg6" Dec 03 20:14:26.443948 master-0 kubenswrapper[29252]: I1203 20:14:26.443929 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/49913de2-24ef-452c-b82a-1f613baa7438-prometheus-operator-tls\") pod \"prometheus-operator-565bdcb8-mrkck\" (UID: \"49913de2-24ef-452c-b82a-1f613baa7438\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck" Dec 03 20:14:26.444053 master-0 kubenswrapper[29252]: I1203 20:14:26.444037 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/82b6f6a1-aac8-4293-bdf9-8e85ca6d5898-serviceca\") 
pod \"node-ca-cdvg6\" (UID: \"82b6f6a1-aac8-4293-bdf9-8e85ca6d5898\") " pod="openshift-image-registry/node-ca-cdvg6" Dec 03 20:14:26.447363 master-0 kubenswrapper[29252]: I1203 20:14:26.447337 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/49913de2-24ef-452c-b82a-1f613baa7438-prometheus-operator-tls\") pod \"prometheus-operator-565bdcb8-mrkck\" (UID: \"49913de2-24ef-452c-b82a-1f613baa7438\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck" Dec 03 20:14:26.553877 master-0 kubenswrapper[29252]: I1203 20:14:26.550660 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82b6f6a1-aac8-4293-bdf9-8e85ca6d5898-host\") pod \"node-ca-cdvg6\" (UID: \"82b6f6a1-aac8-4293-bdf9-8e85ca6d5898\") " pod="openshift-image-registry/node-ca-cdvg6" Dec 03 20:14:26.553877 master-0 kubenswrapper[29252]: I1203 20:14:26.550766 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/82b6f6a1-aac8-4293-bdf9-8e85ca6d5898-serviceca\") pod \"node-ca-cdvg6\" (UID: \"82b6f6a1-aac8-4293-bdf9-8e85ca6d5898\") " pod="openshift-image-registry/node-ca-cdvg6" Dec 03 20:14:26.553877 master-0 kubenswrapper[29252]: I1203 20:14:26.550833 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp28p\" (UniqueName: \"kubernetes.io/projected/82b6f6a1-aac8-4293-bdf9-8e85ca6d5898-kube-api-access-pp28p\") pod \"node-ca-cdvg6\" (UID: \"82b6f6a1-aac8-4293-bdf9-8e85ca6d5898\") " pod="openshift-image-registry/node-ca-cdvg6" Dec 03 20:14:26.553877 master-0 kubenswrapper[29252]: I1203 20:14:26.550896 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82b6f6a1-aac8-4293-bdf9-8e85ca6d5898-host\") pod \"node-ca-cdvg6\" (UID: 
\"82b6f6a1-aac8-4293-bdf9-8e85ca6d5898\") " pod="openshift-image-registry/node-ca-cdvg6" Dec 03 20:14:26.553877 master-0 kubenswrapper[29252]: I1203 20:14:26.551543 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/82b6f6a1-aac8-4293-bdf9-8e85ca6d5898-serviceca\") pod \"node-ca-cdvg6\" (UID: \"82b6f6a1-aac8-4293-bdf9-8e85ca6d5898\") " pod="openshift-image-registry/node-ca-cdvg6" Dec 03 20:14:26.577599 master-0 kubenswrapper[29252]: I1203 20:14:26.577512 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp28p\" (UniqueName: \"kubernetes.io/projected/82b6f6a1-aac8-4293-bdf9-8e85ca6d5898-kube-api-access-pp28p\") pod \"node-ca-cdvg6\" (UID: \"82b6f6a1-aac8-4293-bdf9-8e85ca6d5898\") " pod="openshift-image-registry/node-ca-cdvg6" Dec 03 20:14:26.621858 master-0 kubenswrapper[29252]: I1203 20:14:26.621817 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6f5db8559b-hd8bd"] Dec 03 20:14:26.626914 master-0 kubenswrapper[29252]: W1203 20:14:26.626857 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68a9a9c5_fd3f_4a9b_ba9f_6a02dbb70f64.slice/crio-7ab7295f1315258c6ef5f40cfcff6b310b1feac2c85ce299f75b1f95b6e14e3a WatchSource:0}: Error finding container 7ab7295f1315258c6ef5f40cfcff6b310b1feac2c85ce299f75b1f95b6e14e3a: Status 404 returned error can't find the container with id 7ab7295f1315258c6ef5f40cfcff6b310b1feac2c85ce299f75b1f95b6e14e3a Dec 03 20:14:26.661299 master-0 kubenswrapper[29252]: I1203 20:14:26.661231 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck" Dec 03 20:14:26.669372 master-0 kubenswrapper[29252]: I1203 20:14:26.669324 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-cdvg6" Dec 03 20:14:26.703486 master-0 kubenswrapper[29252]: W1203 20:14:26.703422 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82b6f6a1_aac8_4293_bdf9_8e85ca6d5898.slice/crio-b2f76a62f1a2ed2aaf1c0d432195ff0a4fb0b398030d860dca112b7e62966022 WatchSource:0}: Error finding container b2f76a62f1a2ed2aaf1c0d432195ff0a4fb0b398030d860dca112b7e62966022: Status 404 returned error can't find the container with id b2f76a62f1a2ed2aaf1c0d432195ff0a4fb0b398030d860dca112b7e62966022 Dec 03 20:14:27.099897 master-0 kubenswrapper[29252]: W1203 20:14:27.099838 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49913de2_24ef_452c_b82a_1f613baa7438.slice/crio-9f7daf153332d4aabebf46eb5c6c9c7a427ad59dd1a271c6a7eba5235319ac9c WatchSource:0}: Error finding container 9f7daf153332d4aabebf46eb5c6c9c7a427ad59dd1a271c6a7eba5235319ac9c: Status 404 returned error can't find the container with id 9f7daf153332d4aabebf46eb5c6c9c7a427ad59dd1a271c6a7eba5235319ac9c Dec 03 20:14:27.100009 master-0 kubenswrapper[29252]: I1203 20:14:27.099959 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-565bdcb8-mrkck"] Dec 03 20:14:27.154962 master-0 kubenswrapper[29252]: I1203 20:14:27.154272 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cdvg6" event={"ID":"82b6f6a1-aac8-4293-bdf9-8e85ca6d5898","Type":"ContainerStarted","Data":"b2f76a62f1a2ed2aaf1c0d432195ff0a4fb0b398030d860dca112b7e62966022"} Dec 03 20:14:27.158125 master-0 kubenswrapper[29252]: I1203 20:14:27.158017 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6f5db8559b-hd8bd" 
event={"ID":"68a9a9c5-fd3f-4a9b-ba9f-6a02dbb70f64","Type":"ContainerStarted","Data":"7ab7295f1315258c6ef5f40cfcff6b310b1feac2c85ce299f75b1f95b6e14e3a"} Dec 03 20:14:27.159398 master-0 kubenswrapper[29252]: I1203 20:14:27.159331 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck" event={"ID":"49913de2-24ef-452c-b82a-1f613baa7438","Type":"ContainerStarted","Data":"9f7daf153332d4aabebf46eb5c6c9c7a427ad59dd1a271c6a7eba5235319ac9c"} Dec 03 20:14:29.097330 master-0 kubenswrapper[29252]: I1203 20:14:29.097263 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-65d8f97447-xswx9"] Dec 03 20:14:29.099752 master-0 kubenswrapper[29252]: I1203 20:14:29.098391 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.110299 master-0 kubenswrapper[29252]: I1203 20:14:29.108118 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 20:14:29.110299 master-0 kubenswrapper[29252]: I1203 20:14:29.108528 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 20:14:29.120256 master-0 kubenswrapper[29252]: I1203 20:14:29.120224 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 20:14:29.120637 master-0 kubenswrapper[29252]: I1203 20:14:29.120620 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-2bcfr" Dec 03 20:14:29.121264 master-0 kubenswrapper[29252]: I1203 20:14:29.120709 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 20:14:29.121627 master-0 kubenswrapper[29252]: I1203 
20:14:29.120440 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 20:14:29.121845 master-0 kubenswrapper[29252]: I1203 20:14:29.120957 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 20:14:29.122124 master-0 kubenswrapper[29252]: I1203 20:14:29.120997 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 20:14:29.122281 master-0 kubenswrapper[29252]: I1203 20:14:29.121036 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 20:14:29.122436 master-0 kubenswrapper[29252]: I1203 20:14:29.120402 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 20:14:29.135861 master-0 kubenswrapper[29252]: I1203 20:14:29.130893 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 20:14:29.135861 master-0 kubenswrapper[29252]: I1203 20:14:29.131839 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 20:14:29.135861 master-0 kubenswrapper[29252]: I1203 20:14:29.134797 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 20:14:29.142750 master-0 kubenswrapper[29252]: I1203 20:14:29.142681 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 20:14:29.144023 master-0 kubenswrapper[29252]: I1203 20:14:29.143410 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-65d8f97447-xswx9"] Dec 03 20:14:29.192091 master-0 
kubenswrapper[29252]: I1203 20:14:29.180787 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8db6fff6-9e07-4c7d-97b7-dea394f706c6-audit-dir\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.192091 master-0 kubenswrapper[29252]: I1203 20:14:29.180835 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.192091 master-0 kubenswrapper[29252]: I1203 20:14:29.180860 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-session\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.192091 master-0 kubenswrapper[29252]: I1203 20:14:29.180879 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-user-template-login\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.192091 master-0 kubenswrapper[29252]: I1203 20:14:29.181090 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.192091 master-0 kubenswrapper[29252]: I1203 20:14:29.181118 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-user-template-error\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.192091 master-0 kubenswrapper[29252]: I1203 20:14:29.181142 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj7sv\" (UniqueName: \"kubernetes.io/projected/8db6fff6-9e07-4c7d-97b7-dea394f706c6-kube-api-access-rj7sv\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.192091 master-0 kubenswrapper[29252]: I1203 20:14:29.181165 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.192091 master-0 kubenswrapper[29252]: I1203 20:14:29.181184 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-service-ca\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.192091 master-0 kubenswrapper[29252]: I1203 20:14:29.181201 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-audit-policies\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.192091 master-0 kubenswrapper[29252]: I1203 20:14:29.181217 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.192091 master-0 kubenswrapper[29252]: I1203 20:14:29.181233 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.192091 master-0 kubenswrapper[29252]: I1203 20:14:29.181254 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.276541 master-0 kubenswrapper[29252]: I1203 20:14:29.276427 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b6f946576-zgpxr"] Dec 03 20:14:29.278247 master-0 kubenswrapper[29252]: I1203 20:14:29.277882 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.280301 master-0 kubenswrapper[29252]: I1203 20:14:29.280259 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 20:14:29.280637 master-0 kubenswrapper[29252]: I1203 20:14:29.280594 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-brhmz" Dec 03 20:14:29.280741 master-0 kubenswrapper[29252]: I1203 20:14:29.280718 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 20:14:29.280901 master-0 kubenswrapper[29252]: I1203 20:14:29.280861 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 20:14:29.281007 master-0 kubenswrapper[29252]: I1203 20:14:29.280988 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 20:14:29.283416 master-0 kubenswrapper[29252]: I1203 20:14:29.282184 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 20:14:29.283416 master-0 kubenswrapper[29252]: I1203 20:14:29.282687 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8db6fff6-9e07-4c7d-97b7-dea394f706c6-audit-dir\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " 
pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.283416 master-0 kubenswrapper[29252]: I1203 20:14:29.282723 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.283416 master-0 kubenswrapper[29252]: I1203 20:14:29.282750 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-session\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.283416 master-0 kubenswrapper[29252]: I1203 20:14:29.282796 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-user-template-login\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.283416 master-0 kubenswrapper[29252]: I1203 20:14:29.282823 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.283416 master-0 kubenswrapper[29252]: I1203 20:14:29.282863 29252 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-user-template-error\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.283416 master-0 kubenswrapper[29252]: I1203 20:14:29.282892 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj7sv\" (UniqueName: \"kubernetes.io/projected/8db6fff6-9e07-4c7d-97b7-dea394f706c6-kube-api-access-rj7sv\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.283416 master-0 kubenswrapper[29252]: I1203 20:14:29.282919 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.283416 master-0 kubenswrapper[29252]: I1203 20:14:29.282953 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-service-ca\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.283416 master-0 kubenswrapper[29252]: I1203 20:14:29.282974 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-audit-policies\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.283416 master-0 kubenswrapper[29252]: I1203 20:14:29.282994 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.283416 master-0 kubenswrapper[29252]: I1203 20:14:29.283018 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.283416 master-0 kubenswrapper[29252]: I1203 20:14:29.283047 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-router-certs\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.283416 master-0 kubenswrapper[29252]: I1203 20:14:29.283070 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8db6fff6-9e07-4c7d-97b7-dea394f706c6-audit-dir\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " 
pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.284299 master-0 kubenswrapper[29252]: I1203 20:14:29.284273 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-audit-policies\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.284942 master-0 kubenswrapper[29252]: I1203 20:14:29.284891 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-service-ca\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.286317 master-0 kubenswrapper[29252]: I1203 20:14:29.286266 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.287491 master-0 kubenswrapper[29252]: I1203 20:14:29.287441 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.287999 master-0 kubenswrapper[29252]: I1203 20:14:29.287964 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.289795 master-0 kubenswrapper[29252]: I1203 20:14:29.289735 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-session\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.290810 master-0 kubenswrapper[29252]: I1203 20:14:29.290742 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-user-template-error\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.291229 master-0 kubenswrapper[29252]: I1203 20:14:29.291172 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-user-template-login\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.291545 master-0 kubenswrapper[29252]: I1203 20:14:29.291508 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.292220 master-0 kubenswrapper[29252]: I1203 20:14:29.292183 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-router-certs\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.294500 master-0 kubenswrapper[29252]: I1203 20:14:29.294445 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.298625 master-0 kubenswrapper[29252]: I1203 20:14:29.298581 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b6f946576-zgpxr"] Dec 03 20:14:29.305035 master-0 kubenswrapper[29252]: I1203 20:14:29.304852 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj7sv\" (UniqueName: \"kubernetes.io/projected/8db6fff6-9e07-4c7d-97b7-dea394f706c6-kube-api-access-rj7sv\") pod \"oauth-openshift-65d8f97447-xswx9\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.384143 master-0 kubenswrapper[29252]: I1203 20:14:29.384078 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-console-oauth-config\") pod 
\"console-5b6f946576-zgpxr\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.384143 master-0 kubenswrapper[29252]: I1203 20:14:29.384143 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-console-config\") pod \"console-5b6f946576-zgpxr\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.384807 master-0 kubenswrapper[29252]: I1203 20:14:29.384185 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-oauth-serving-cert\") pod \"console-5b6f946576-zgpxr\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.384807 master-0 kubenswrapper[29252]: I1203 20:14:29.384208 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-console-serving-cert\") pod \"console-5b6f946576-zgpxr\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.384807 master-0 kubenswrapper[29252]: I1203 20:14:29.384256 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6k8b\" (UniqueName: \"kubernetes.io/projected/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-kube-api-access-l6k8b\") pod \"console-5b6f946576-zgpxr\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.384807 master-0 kubenswrapper[29252]: I1203 20:14:29.384288 29252 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-service-ca\") pod \"console-5b6f946576-zgpxr\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.429317 master-0 kubenswrapper[29252]: I1203 20:14:29.429252 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:29.487409 master-0 kubenswrapper[29252]: I1203 20:14:29.487353 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-console-oauth-config\") pod \"console-5b6f946576-zgpxr\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.487618 master-0 kubenswrapper[29252]: I1203 20:14:29.487595 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-console-config\") pod \"console-5b6f946576-zgpxr\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.487703 master-0 kubenswrapper[29252]: I1203 20:14:29.487679 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-oauth-serving-cert\") pod \"console-5b6f946576-zgpxr\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.488805 master-0 kubenswrapper[29252]: I1203 20:14:29.487751 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-console-serving-cert\") pod \"console-5b6f946576-zgpxr\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.488805 master-0 kubenswrapper[29252]: I1203 20:14:29.487912 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6k8b\" (UniqueName: \"kubernetes.io/projected/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-kube-api-access-l6k8b\") pod \"console-5b6f946576-zgpxr\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.488805 master-0 kubenswrapper[29252]: I1203 20:14:29.487945 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-service-ca\") pod \"console-5b6f946576-zgpxr\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.488805 master-0 kubenswrapper[29252]: I1203 20:14:29.488536 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-console-config\") pod \"console-5b6f946576-zgpxr\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.488805 master-0 kubenswrapper[29252]: I1203 20:14:29.488732 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-oauth-serving-cert\") pod \"console-5b6f946576-zgpxr\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.489046 master-0 kubenswrapper[29252]: I1203 20:14:29.488828 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-service-ca\") pod \"console-5b6f946576-zgpxr\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.490856 master-0 kubenswrapper[29252]: I1203 20:14:29.490819 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-console-oauth-config\") pod \"console-5b6f946576-zgpxr\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.490956 master-0 kubenswrapper[29252]: I1203 20:14:29.490934 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-console-serving-cert\") pod \"console-5b6f946576-zgpxr\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.504926 master-0 kubenswrapper[29252]: I1203 20:14:29.504871 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6k8b\" (UniqueName: \"kubernetes.io/projected/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-kube-api-access-l6k8b\") pod \"console-5b6f946576-zgpxr\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.646303 master-0 kubenswrapper[29252]: I1203 20:14:29.645972 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:29.842856 master-0 kubenswrapper[29252]: I1203 20:14:29.836255 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-65d8f97447-xswx9"] Dec 03 20:14:30.115257 master-0 kubenswrapper[29252]: I1203 20:14:30.115218 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b6f946576-zgpxr"] Dec 03 20:14:30.120049 master-0 kubenswrapper[29252]: W1203 20:14:30.120000 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33f0cc6e_2015_4c7e_848f_ccca37ad61c4.slice/crio-8ab6f76bd1e5bfd809855b37cc1500209792bebbeb2a5ca5cd9846c952e06d3f WatchSource:0}: Error finding container 8ab6f76bd1e5bfd809855b37cc1500209792bebbeb2a5ca5cd9846c952e06d3f: Status 404 returned error can't find the container with id 8ab6f76bd1e5bfd809855b37cc1500209792bebbeb2a5ca5cd9846c952e06d3f Dec 03 20:14:30.181957 master-0 kubenswrapper[29252]: I1203 20:14:30.181894 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b6f946576-zgpxr" event={"ID":"33f0cc6e-2015-4c7e-848f-ccca37ad61c4","Type":"ContainerStarted","Data":"8ab6f76bd1e5bfd809855b37cc1500209792bebbeb2a5ca5cd9846c952e06d3f"} Dec 03 20:14:30.183802 master-0 kubenswrapper[29252]: I1203 20:14:30.183738 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck" event={"ID":"49913de2-24ef-452c-b82a-1f613baa7438","Type":"ContainerStarted","Data":"c366ee981e17b9b2e1a3b09a55c9355fe28eb3c02350b70b00a0ed91c88cee9a"} Dec 03 20:14:30.183802 master-0 kubenswrapper[29252]: I1203 20:14:30.183802 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck" 
event={"ID":"49913de2-24ef-452c-b82a-1f613baa7438","Type":"ContainerStarted","Data":"6552c8f397e8dd0e4f4f1c4003150e3166d518dee39c3f4bb54eae1e288c64af"} Dec 03 20:14:30.185319 master-0 kubenswrapper[29252]: I1203 20:14:30.185285 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cdvg6" event={"ID":"82b6f6a1-aac8-4293-bdf9-8e85ca6d5898","Type":"ContainerStarted","Data":"2b1a599ff9fee57a40cad550089699aa7ca17f740d1ef117af0754576366fcb1"} Dec 03 20:14:30.186295 master-0 kubenswrapper[29252]: I1203 20:14:30.186255 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" event={"ID":"8db6fff6-9e07-4c7d-97b7-dea394f706c6","Type":"ContainerStarted","Data":"9243afb7e08524d8a4682638e082352b84f1ae98af760a018077fe7f49f550aa"} Dec 03 20:14:30.206744 master-0 kubenswrapper[29252]: I1203 20:14:30.206670 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-565bdcb8-mrkck" podStartSLOduration=2.864366208 podStartE2EDuration="5.206650114s" podCreationTimestamp="2025-12-03 20:14:25 +0000 UTC" firstStartedPulling="2025-12-03 20:14:27.102230805 +0000 UTC m=+301.915775758" lastFinishedPulling="2025-12-03 20:14:29.444514711 +0000 UTC m=+304.258059664" observedRunningTime="2025-12-03 20:14:30.199144952 +0000 UTC m=+305.012689905" watchObservedRunningTime="2025-12-03 20:14:30.206650114 +0000 UTC m=+305.020195067" Dec 03 20:14:30.215020 master-0 kubenswrapper[29252]: I1203 20:14:30.214961 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cdvg6" podStartSLOduration=1.488974274 podStartE2EDuration="4.214943675s" podCreationTimestamp="2025-12-03 20:14:26 +0000 UTC" firstStartedPulling="2025-12-03 20:14:26.707214475 +0000 UTC m=+301.520759428" lastFinishedPulling="2025-12-03 20:14:29.433183886 +0000 UTC m=+304.246728829" observedRunningTime="2025-12-03 
20:14:30.213390238 +0000 UTC m=+305.026935221" watchObservedRunningTime="2025-12-03 20:14:30.214943675 +0000 UTC m=+305.028488628" Dec 03 20:14:31.923371 master-0 kubenswrapper[29252]: I1203 20:14:31.923138 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fbbdc9bc8-g94br"] Dec 03 20:14:31.924421 master-0 kubenswrapper[29252]: I1203 20:14:31.924404 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:31.930804 master-0 kubenswrapper[29252]: I1203 20:14:31.930751 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fbbdc9bc8-g94br"] Dec 03 20:14:31.932456 master-0 kubenswrapper[29252]: I1203 20:14:31.932417 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 20:14:32.046466 master-0 kubenswrapper[29252]: I1203 20:14:32.046391 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f987dc-c7cb-4818-a321-6b92375224a0-console-serving-cert\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.046466 master-0 kubenswrapper[29252]: I1203 20:14:32.046444 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-console-config\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.046466 master-0 kubenswrapper[29252]: I1203 20:14:32.046467 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-service-ca\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.046758 master-0 kubenswrapper[29252]: I1203 20:14:32.046489 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-trusted-ca-bundle\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.046758 master-0 kubenswrapper[29252]: I1203 20:14:32.046507 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmcr9\" (UniqueName: \"kubernetes.io/projected/d3f987dc-c7cb-4818-a321-6b92375224a0-kube-api-access-pmcr9\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.046758 master-0 kubenswrapper[29252]: I1203 20:14:32.046548 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-oauth-serving-cert\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.046758 master-0 kubenswrapper[29252]: I1203 20:14:32.046575 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d3f987dc-c7cb-4818-a321-6b92375224a0-console-oauth-config\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.147790 master-0 kubenswrapper[29252]: I1203 
20:14:32.147719 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-oauth-serving-cert\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.147968 master-0 kubenswrapper[29252]: I1203 20:14:32.147873 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d3f987dc-c7cb-4818-a321-6b92375224a0-console-oauth-config\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.147968 master-0 kubenswrapper[29252]: I1203 20:14:32.147910 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f987dc-c7cb-4818-a321-6b92375224a0-console-serving-cert\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.147968 master-0 kubenswrapper[29252]: I1203 20:14:32.147926 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-console-config\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.147968 master-0 kubenswrapper[29252]: I1203 20:14:32.147950 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-service-ca\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.147968 
master-0 kubenswrapper[29252]: I1203 20:14:32.147970 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-trusted-ca-bundle\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.148140 master-0 kubenswrapper[29252]: I1203 20:14:32.147989 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmcr9\" (UniqueName: \"kubernetes.io/projected/d3f987dc-c7cb-4818-a321-6b92375224a0-kube-api-access-pmcr9\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.151699 master-0 kubenswrapper[29252]: I1203 20:14:32.151661 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-console-config\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.152628 master-0 kubenswrapper[29252]: I1203 20:14:32.152594 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-service-ca\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.153219 master-0 kubenswrapper[29252]: I1203 20:14:32.153200 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-trusted-ca-bundle\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 
20:14:32.153668 master-0 kubenswrapper[29252]: I1203 20:14:32.153626 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-oauth-serving-cert\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.163603 master-0 kubenswrapper[29252]: I1203 20:14:32.156142 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f987dc-c7cb-4818-a321-6b92375224a0-console-serving-cert\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.172451 master-0 kubenswrapper[29252]: I1203 20:14:32.172404 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d3f987dc-c7cb-4818-a321-6b92375224a0-console-oauth-config\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.179661 master-0 kubenswrapper[29252]: I1203 20:14:32.179558 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-x2wbs"] Dec 03 20:14:32.193269 master-0 kubenswrapper[29252]: I1203 20:14:32.192024 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmcr9\" (UniqueName: \"kubernetes.io/projected/d3f987dc-c7cb-4818-a321-6b92375224a0-kube-api-access-pmcr9\") pod \"console-6fbbdc9bc8-g94br\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.216077 master-0 kubenswrapper[29252]: I1203 20:14:32.213710 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k"] Dec 03 
20:14:32.220621 master-0 kubenswrapper[29252]: I1203 20:14:32.218188 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.221707 master-0 kubenswrapper[29252]: I1203 20:14:32.221504 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" Dec 03 20:14:32.225029 master-0 kubenswrapper[29252]: I1203 20:14:32.224158 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k"] Dec 03 20:14:32.225029 master-0 kubenswrapper[29252]: I1203 20:14:32.224314 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-2zv7f" Dec 03 20:14:32.225029 master-0 kubenswrapper[29252]: I1203 20:14:32.224366 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-4tvlh" Dec 03 20:14:32.225029 master-0 kubenswrapper[29252]: I1203 20:14:32.224545 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Dec 03 20:14:32.225201 master-0 kubenswrapper[29252]: I1203 20:14:32.225047 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Dec 03 20:14:32.226207 master-0 kubenswrapper[29252]: I1203 20:14:32.225617 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Dec 03 20:14:32.226207 master-0 kubenswrapper[29252]: I1203 20:14:32.225743 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Dec 03 20:14:32.228496 master-0 kubenswrapper[29252]: I1203 20:14:32.228454 29252 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr"] Dec 03 20:14:32.235096 master-0 kubenswrapper[29252]: I1203 20:14:32.234429 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.237503 master-0 kubenswrapper[29252]: I1203 20:14:32.237469 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Dec 03 20:14:32.239953 master-0 kubenswrapper[29252]: I1203 20:14:32.239839 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Dec 03 20:14:32.241851 master-0 kubenswrapper[29252]: I1203 20:14:32.240058 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Dec 03 20:14:32.241851 master-0 kubenswrapper[29252]: I1203 20:14:32.240205 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr"] Dec 03 20:14:32.241851 master-0 kubenswrapper[29252]: I1203 20:14:32.240216 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-9xvtd" Dec 03 20:14:32.250798 master-0 kubenswrapper[29252]: I1203 20:14:32.250666 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89f08828-d22f-48a0-b247-fbe323742568-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.250798 master-0 kubenswrapper[29252]: I1203 20:14:32.250727 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/89f08828-d22f-48a0-b247-fbe323742568-sys\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.250798 master-0 kubenswrapper[29252]: I1203 20:14:32.250769 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwr87\" (UniqueName: \"kubernetes.io/projected/70f550ce-35e6-482b-a7ff-4a8c11569406-kube-api-access-pwr87\") pod \"openshift-state-metrics-57cbc648f8-4j26k\" (UID: \"70f550ce-35e6-482b-a7ff-4a8c11569406\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" Dec 03 20:14:32.255369 master-0 kubenswrapper[29252]: I1203 20:14:32.250814 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/89f08828-d22f-48a0-b247-fbe323742568-root\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.255369 master-0 kubenswrapper[29252]: I1203 20:14:32.250877 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/70f550ce-35e6-482b-a7ff-4a8c11569406-metrics-client-ca\") pod \"openshift-state-metrics-57cbc648f8-4j26k\" (UID: \"70f550ce-35e6-482b-a7ff-4a8c11569406\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" Dec 03 20:14:32.255369 master-0 kubenswrapper[29252]: I1203 20:14:32.250903 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/89f08828-d22f-48a0-b247-fbe323742568-node-exporter-tls\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.255369 master-0 kubenswrapper[29252]: 
I1203 20:14:32.250967 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/70f550ce-35e6-482b-a7ff-4a8c11569406-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-57cbc648f8-4j26k\" (UID: \"70f550ce-35e6-482b-a7ff-4a8c11569406\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" Dec 03 20:14:32.255369 master-0 kubenswrapper[29252]: I1203 20:14:32.251023 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/89f08828-d22f-48a0-b247-fbe323742568-node-exporter-textfile\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.255369 master-0 kubenswrapper[29252]: I1203 20:14:32.251090 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/89f08828-d22f-48a0-b247-fbe323742568-node-exporter-wtmp\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.255369 master-0 kubenswrapper[29252]: I1203 20:14:32.251115 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89f08828-d22f-48a0-b247-fbe323742568-metrics-client-ca\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.255369 master-0 kubenswrapper[29252]: I1203 20:14:32.251174 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/70f550ce-35e6-482b-a7ff-4a8c11569406-openshift-state-metrics-tls\") pod \"openshift-state-metrics-57cbc648f8-4j26k\" (UID: \"70f550ce-35e6-482b-a7ff-4a8c11569406\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" Dec 03 20:14:32.255369 master-0 kubenswrapper[29252]: I1203 20:14:32.251249 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9cn5\" (UniqueName: \"kubernetes.io/projected/89f08828-d22f-48a0-b247-fbe323742568-kube-api-access-b9cn5\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.257004 master-0 kubenswrapper[29252]: I1203 20:14:32.256961 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.353060 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwr87\" (UniqueName: \"kubernetes.io/projected/70f550ce-35e6-482b-a7ff-4a8c11569406-kube-api-access-pwr87\") pod \"openshift-state-metrics-57cbc648f8-4j26k\" (UID: \"70f550ce-35e6-482b-a7ff-4a8c11569406\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.353133 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28e4a5ec-9304-475d-8321-13b21985d688-metrics-client-ca\") pod \"kube-state-metrics-7dcc7f9bd6-wj8nr\" (UID: \"28e4a5ec-9304-475d-8321-13b21985d688\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.353174 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" 
(UniqueName: \"kubernetes.io/empty-dir/28e4a5ec-9304-475d-8321-13b21985d688-volume-directive-shadow\") pod \"kube-state-metrics-7dcc7f9bd6-wj8nr\" (UID: \"28e4a5ec-9304-475d-8321-13b21985d688\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.353203 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/89f08828-d22f-48a0-b247-fbe323742568-root\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.353237 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/28e4a5ec-9304-475d-8321-13b21985d688-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7dcc7f9bd6-wj8nr\" (UID: \"28e4a5ec-9304-475d-8321-13b21985d688\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.353263 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/70f550ce-35e6-482b-a7ff-4a8c11569406-metrics-client-ca\") pod \"openshift-state-metrics-57cbc648f8-4j26k\" (UID: \"70f550ce-35e6-482b-a7ff-4a8c11569406\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.353288 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/89f08828-d22f-48a0-b247-fbe323742568-node-exporter-tls\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.355304 
master-0 kubenswrapper[29252]: I1203 20:14:32.353315 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/89f08828-d22f-48a0-b247-fbe323742568-node-exporter-wtmp\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.353331 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/70f550ce-35e6-482b-a7ff-4a8c11569406-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-57cbc648f8-4j26k\" (UID: \"70f550ce-35e6-482b-a7ff-4a8c11569406\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.353351 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/89f08828-d22f-48a0-b247-fbe323742568-node-exporter-textfile\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.353374 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89f08828-d22f-48a0-b247-fbe323742568-metrics-client-ca\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.353398 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/70f550ce-35e6-482b-a7ff-4a8c11569406-openshift-state-metrics-tls\") pod 
\"openshift-state-metrics-57cbc648f8-4j26k\" (UID: \"70f550ce-35e6-482b-a7ff-4a8c11569406\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.353422 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9cn5\" (UniqueName: \"kubernetes.io/projected/89f08828-d22f-48a0-b247-fbe323742568-kube-api-access-b9cn5\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.353454 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/28e4a5ec-9304-475d-8321-13b21985d688-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7dcc7f9bd6-wj8nr\" (UID: \"28e4a5ec-9304-475d-8321-13b21985d688\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.353475 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skpnj\" (UniqueName: \"kubernetes.io/projected/28e4a5ec-9304-475d-8321-13b21985d688-kube-api-access-skpnj\") pod \"kube-state-metrics-7dcc7f9bd6-wj8nr\" (UID: \"28e4a5ec-9304-475d-8321-13b21985d688\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.353510 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/28e4a5ec-9304-475d-8321-13b21985d688-kube-state-metrics-tls\") pod \"kube-state-metrics-7dcc7f9bd6-wj8nr\" (UID: \"28e4a5ec-9304-475d-8321-13b21985d688\") " 
pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.353535 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89f08828-d22f-48a0-b247-fbe323742568-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.353556 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/89f08828-d22f-48a0-b247-fbe323742568-sys\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.353875 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/89f08828-d22f-48a0-b247-fbe323742568-sys\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.355304 master-0 kubenswrapper[29252]: I1203 20:14:32.354262 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/89f08828-d22f-48a0-b247-fbe323742568-root\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.356296 master-0 kubenswrapper[29252]: I1203 20:14:32.356245 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/70f550ce-35e6-482b-a7ff-4a8c11569406-metrics-client-ca\") pod \"openshift-state-metrics-57cbc648f8-4j26k\" (UID: \"70f550ce-35e6-482b-a7ff-4a8c11569406\") " 
pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" Dec 03 20:14:32.358019 master-0 kubenswrapper[29252]: I1203 20:14:32.356703 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/89f08828-d22f-48a0-b247-fbe323742568-node-exporter-textfile\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.358019 master-0 kubenswrapper[29252]: I1203 20:14:32.356973 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89f08828-d22f-48a0-b247-fbe323742568-metrics-client-ca\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.358264 master-0 kubenswrapper[29252]: I1203 20:14:32.358233 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/70f550ce-35e6-482b-a7ff-4a8c11569406-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-57cbc648f8-4j26k\" (UID: \"70f550ce-35e6-482b-a7ff-4a8c11569406\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" Dec 03 20:14:32.358363 master-0 kubenswrapper[29252]: I1203 20:14:32.358346 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/89f08828-d22f-48a0-b247-fbe323742568-node-exporter-wtmp\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.358552 master-0 kubenswrapper[29252]: I1203 20:14:32.358496 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/89f08828-d22f-48a0-b247-fbe323742568-node-exporter-tls\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.368593 master-0 kubenswrapper[29252]: I1203 20:14:32.361238 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89f08828-d22f-48a0-b247-fbe323742568-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.368593 master-0 kubenswrapper[29252]: I1203 20:14:32.368469 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/70f550ce-35e6-482b-a7ff-4a8c11569406-openshift-state-metrics-tls\") pod \"openshift-state-metrics-57cbc648f8-4j26k\" (UID: \"70f550ce-35e6-482b-a7ff-4a8c11569406\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" Dec 03 20:14:32.374310 master-0 kubenswrapper[29252]: I1203 20:14:32.374155 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwr87\" (UniqueName: \"kubernetes.io/projected/70f550ce-35e6-482b-a7ff-4a8c11569406-kube-api-access-pwr87\") pod \"openshift-state-metrics-57cbc648f8-4j26k\" (UID: \"70f550ce-35e6-482b-a7ff-4a8c11569406\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" Dec 03 20:14:32.374310 master-0 kubenswrapper[29252]: I1203 20:14:32.374251 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9cn5\" (UniqueName: \"kubernetes.io/projected/89f08828-d22f-48a0-b247-fbe323742568-kube-api-access-b9cn5\") pod \"node-exporter-x2wbs\" (UID: \"89f08828-d22f-48a0-b247-fbe323742568\") " pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.456435 master-0 kubenswrapper[29252]: I1203 
20:14:32.456066 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/28e4a5ec-9304-475d-8321-13b21985d688-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7dcc7f9bd6-wj8nr\" (UID: \"28e4a5ec-9304-475d-8321-13b21985d688\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.456435 master-0 kubenswrapper[29252]: I1203 20:14:32.456164 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skpnj\" (UniqueName: \"kubernetes.io/projected/28e4a5ec-9304-475d-8321-13b21985d688-kube-api-access-skpnj\") pod \"kube-state-metrics-7dcc7f9bd6-wj8nr\" (UID: \"28e4a5ec-9304-475d-8321-13b21985d688\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.456435 master-0 kubenswrapper[29252]: I1203 20:14:32.456243 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/28e4a5ec-9304-475d-8321-13b21985d688-kube-state-metrics-tls\") pod \"kube-state-metrics-7dcc7f9bd6-wj8nr\" (UID: \"28e4a5ec-9304-475d-8321-13b21985d688\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.456435 master-0 kubenswrapper[29252]: I1203 20:14:32.456312 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28e4a5ec-9304-475d-8321-13b21985d688-metrics-client-ca\") pod \"kube-state-metrics-7dcc7f9bd6-wj8nr\" (UID: \"28e4a5ec-9304-475d-8321-13b21985d688\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.456435 master-0 kubenswrapper[29252]: I1203 20:14:32.456364 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/28e4a5ec-9304-475d-8321-13b21985d688-volume-directive-shadow\") pod \"kube-state-metrics-7dcc7f9bd6-wj8nr\" (UID: \"28e4a5ec-9304-475d-8321-13b21985d688\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.456435 master-0 kubenswrapper[29252]: I1203 20:14:32.456429 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/28e4a5ec-9304-475d-8321-13b21985d688-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7dcc7f9bd6-wj8nr\" (UID: \"28e4a5ec-9304-475d-8321-13b21985d688\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.458861 master-0 kubenswrapper[29252]: I1203 20:14:32.458802 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/28e4a5ec-9304-475d-8321-13b21985d688-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7dcc7f9bd6-wj8nr\" (UID: \"28e4a5ec-9304-475d-8321-13b21985d688\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.460131 master-0 kubenswrapper[29252]: I1203 20:14:32.459420 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28e4a5ec-9304-475d-8321-13b21985d688-metrics-client-ca\") pod \"kube-state-metrics-7dcc7f9bd6-wj8nr\" (UID: \"28e4a5ec-9304-475d-8321-13b21985d688\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.465561 master-0 kubenswrapper[29252]: I1203 20:14:32.463085 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/28e4a5ec-9304-475d-8321-13b21985d688-volume-directive-shadow\") pod \"kube-state-metrics-7dcc7f9bd6-wj8nr\" (UID: 
\"28e4a5ec-9304-475d-8321-13b21985d688\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.467992 master-0 kubenswrapper[29252]: I1203 20:14:32.467957 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/28e4a5ec-9304-475d-8321-13b21985d688-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7dcc7f9bd6-wj8nr\" (UID: \"28e4a5ec-9304-475d-8321-13b21985d688\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.470189 master-0 kubenswrapper[29252]: I1203 20:14:32.470151 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/28e4a5ec-9304-475d-8321-13b21985d688-kube-state-metrics-tls\") pod \"kube-state-metrics-7dcc7f9bd6-wj8nr\" (UID: \"28e4a5ec-9304-475d-8321-13b21985d688\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.479315 master-0 kubenswrapper[29252]: I1203 20:14:32.479033 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skpnj\" (UniqueName: \"kubernetes.io/projected/28e4a5ec-9304-475d-8321-13b21985d688-kube-api-access-skpnj\") pod \"kube-state-metrics-7dcc7f9bd6-wj8nr\" (UID: \"28e4a5ec-9304-475d-8321-13b21985d688\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:32.545938 master-0 kubenswrapper[29252]: I1203 20:14:32.545892 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-x2wbs" Dec 03 20:14:32.565334 master-0 kubenswrapper[29252]: I1203 20:14:32.565251 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" Dec 03 20:14:32.604364 master-0 kubenswrapper[29252]: I1203 20:14:32.604297 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" Dec 03 20:14:33.281009 master-0 kubenswrapper[29252]: I1203 20:14:33.274944 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 03 20:14:33.281009 master-0 kubenswrapper[29252]: I1203 20:14:33.277569 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 03 20:14:33.281009 master-0 kubenswrapper[29252]: I1203 20:14:33.279137 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Dec 03 20:14:33.281009 master-0 kubenswrapper[29252]: I1203 20:14:33.279328 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Dec 03 20:14:33.281009 master-0 kubenswrapper[29252]: I1203 20:14:33.279490 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Dec 03 20:14:33.281009 master-0 kubenswrapper[29252]: I1203 20:14:33.279649 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Dec 03 20:14:33.281009 master-0 kubenswrapper[29252]: I1203 20:14:33.279930 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Dec 03 20:14:33.281009 master-0 kubenswrapper[29252]: I1203 20:14:33.280076 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Dec 03 20:14:33.281009 master-0 kubenswrapper[29252]: I1203 20:14:33.280476 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-kfnzv" Dec 03 20:14:33.281009 master-0 kubenswrapper[29252]: I1203 20:14:33.280598 29252 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Dec 03 20:14:33.281879 master-0 kubenswrapper[29252]: I1203 20:14:33.281763 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy"
Dec 03 20:14:33.304887 master-0 kubenswrapper[29252]: I1203 20:14:33.303932 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Dec 03 20:14:33.375501 master-0 kubenswrapper[29252]: I1203 20:14:33.375447 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.375747 master-0 kubenswrapper[29252]: I1203 20:14:33.375577 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.375747 master-0 kubenswrapper[29252]: I1203 20:14:33.375632 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.375747 master-0 kubenswrapper[29252]: I1203 20:14:33.375667 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.375747 master-0 kubenswrapper[29252]: I1203 20:14:33.375714 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-config-out\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.375747 master-0 kubenswrapper[29252]: I1203 20:14:33.375736 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.375923 master-0 kubenswrapper[29252]: I1203 20:14:33.375813 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-web-config\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.375923 master-0 kubenswrapper[29252]: I1203 20:14:33.375854 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-config-volume\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.375923 master-0 kubenswrapper[29252]: I1203 20:14:33.375869 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.375923 master-0 kubenswrapper[29252]: I1203 20:14:33.375908 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.376036 master-0 kubenswrapper[29252]: I1203 20:14:33.375958 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.376091 master-0 kubenswrapper[29252]: I1203 20:14:33.376043 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twgrr\" (UniqueName: \"kubernetes.io/projected/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-kube-api-access-twgrr\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.477882 master-0 kubenswrapper[29252]: I1203 20:14:33.477838 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-web-config\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.477882 master-0 kubenswrapper[29252]: I1203 20:14:33.477892 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-config-volume\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.478145 master-0 kubenswrapper[29252]: I1203 20:14:33.477914 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.478145 master-0 kubenswrapper[29252]: I1203 20:14:33.477939 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.478145 master-0 kubenswrapper[29252]: I1203 20:14:33.477954 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.478145 master-0 kubenswrapper[29252]: I1203 20:14:33.477970 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twgrr\" (UniqueName: \"kubernetes.io/projected/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-kube-api-access-twgrr\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.478145 master-0 kubenswrapper[29252]: I1203 20:14:33.478045 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.478145 master-0 kubenswrapper[29252]: I1203 20:14:33.478069 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.478145 master-0 kubenswrapper[29252]: I1203 20:14:33.478102 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.478145 master-0 kubenswrapper[29252]: I1203 20:14:33.478119 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.478458 master-0 kubenswrapper[29252]: I1203 20:14:33.478156 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-config-out\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.478458 master-0 kubenswrapper[29252]: I1203 20:14:33.478175 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.479950 master-0 kubenswrapper[29252]: I1203 20:14:33.479889 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.481492 master-0 kubenswrapper[29252]: I1203 20:14:33.481448 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.481663 master-0 kubenswrapper[29252]: I1203 20:14:33.481501 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.482091 master-0 kubenswrapper[29252]: I1203 20:14:33.482068 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.483067 master-0 kubenswrapper[29252]: I1203 20:14:33.483030 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.483499 master-0 kubenswrapper[29252]: I1203 20:14:33.483462 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-config-out\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.483874 master-0 kubenswrapper[29252]: I1203 20:14:33.483834 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.484576 master-0 kubenswrapper[29252]: I1203 20:14:33.484541 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-config-volume\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.486446 master-0 kubenswrapper[29252]: I1203 20:14:33.485234 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.486446 master-0 kubenswrapper[29252]: I1203 20:14:33.485977 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-web-config\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.486446 master-0 kubenswrapper[29252]: I1203 20:14:33.486282 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.496850 master-0 kubenswrapper[29252]: I1203 20:14:33.496799 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twgrr\" (UniqueName: \"kubernetes.io/projected/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-kube-api-access-twgrr\") pod \"alertmanager-main-0\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.603562 master-0 kubenswrapper[29252]: I1203 20:14:33.603419 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:14:33.951937 master-0 kubenswrapper[29252]: I1203 20:14:33.951873 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-ff788744d-hkt6c"]
Dec 03 20:14:33.952229 master-0 kubenswrapper[29252]: I1203 20:14:33.952090 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" podUID="1c22cb59-5083-4be6-9998-a9e67a2c20cd" containerName="controller-manager" containerID="cri-o://263c9892d0db3a282cd4fd76feedfc7a2f00079133490560ffda1c5aceb719de" gracePeriod=30
Dec 03 20:14:34.049337 master-0 kubenswrapper[29252]: I1203 20:14:34.045492 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"]
Dec 03 20:14:34.049337 master-0 kubenswrapper[29252]: I1203 20:14:34.045680 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj" podUID="c52974d8-fbe6-444b-97ae-468482eebac8" containerName="route-controller-manager" containerID="cri-o://fb26888be03097a323ec8a570f669f63df9f46f37a911bd2cc4812c68e4c8b64" gracePeriod=30
Dec 03 20:14:34.153652 master-0 kubenswrapper[29252]: I1203 20:14:34.153586 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-54c84f8475-9xl5s"]
Dec 03 20:14:34.156368 master-0 kubenswrapper[29252]: I1203 20:14:34.155380 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.157209 master-0 kubenswrapper[29252]: I1203 20:14:34.156705 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-a6k1coh2n07mf"
Dec 03 20:14:34.157209 master-0 kubenswrapper[29252]: I1203 20:14:34.157198 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls"
Dec 03 20:14:34.157324 master-0 kubenswrapper[29252]: I1203 20:14:34.157210 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web"
Dec 03 20:14:34.165851 master-0 kubenswrapper[29252]: I1203 20:14:34.165264 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy"
Dec 03 20:14:34.165851 master-0 kubenswrapper[29252]: I1203 20:14:34.165471 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules"
Dec 03 20:14:34.165851 master-0 kubenswrapper[29252]: I1203 20:14:34.165565 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-6bdg2"
Dec 03 20:14:34.168230 master-0 kubenswrapper[29252]: I1203 20:14:34.168193 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics"
Dec 03 20:14:34.171960 master-0 kubenswrapper[29252]: I1203 20:14:34.171921 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-54c84f8475-9xl5s"]
Dec 03 20:14:34.251915 master-0 kubenswrapper[29252]: I1203 20:14:34.248412 29252 generic.go:334] "Generic (PLEG): container finished" podID="1c22cb59-5083-4be6-9998-a9e67a2c20cd" containerID="263c9892d0db3a282cd4fd76feedfc7a2f00079133490560ffda1c5aceb719de" exitCode=0
Dec 03 20:14:34.251915 master-0 kubenswrapper[29252]: I1203 20:14:34.248767 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" event={"ID":"1c22cb59-5083-4be6-9998-a9e67a2c20cd","Type":"ContainerDied","Data":"263c9892d0db3a282cd4fd76feedfc7a2f00079133490560ffda1c5aceb719de"}
Dec 03 20:14:34.261416 master-0 kubenswrapper[29252]: I1203 20:14:34.261367 29252 generic.go:334] "Generic (PLEG): container finished" podID="c52974d8-fbe6-444b-97ae-468482eebac8" containerID="fb26888be03097a323ec8a570f669f63df9f46f37a911bd2cc4812c68e4c8b64" exitCode=0
Dec 03 20:14:34.261568 master-0 kubenswrapper[29252]: I1203 20:14:34.261421 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj" event={"ID":"c52974d8-fbe6-444b-97ae-468482eebac8","Type":"ContainerDied","Data":"fb26888be03097a323ec8a570f669f63df9f46f37a911bd2cc4812c68e4c8b64"}
Dec 03 20:14:34.297760 master-0 kubenswrapper[29252]: I1203 20:14:34.297113 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/922b1203-f140-45bb-94a2-6efb31cf5ee8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.297760 master-0 kubenswrapper[29252]: I1203 20:14:34.297182 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/922b1203-f140-45bb-94a2-6efb31cf5ee8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.297760 master-0 kubenswrapper[29252]: I1203 20:14:34.297211 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/922b1203-f140-45bb-94a2-6efb31cf5ee8-metrics-client-ca\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.297760 master-0 kubenswrapper[29252]: I1203 20:14:34.297241 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/922b1203-f140-45bb-94a2-6efb31cf5ee8-secret-grpc-tls\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.297760 master-0 kubenswrapper[29252]: I1203 20:14:34.297408 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/922b1203-f140-45bb-94a2-6efb31cf5ee8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.297760 master-0 kubenswrapper[29252]: I1203 20:14:34.297516 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6b54\" (UniqueName: \"kubernetes.io/projected/922b1203-f140-45bb-94a2-6efb31cf5ee8-kube-api-access-t6b54\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.297760 master-0 kubenswrapper[29252]: I1203 20:14:34.297608 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/922b1203-f140-45bb-94a2-6efb31cf5ee8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.297760 master-0 kubenswrapper[29252]: I1203 20:14:34.297633 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/922b1203-f140-45bb-94a2-6efb31cf5ee8-secret-thanos-querier-tls\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.314813 master-0 kubenswrapper[29252]: I1203 20:14:34.314659 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-65d8f97447-xswx9"]
Dec 03 20:14:34.398530 master-0 kubenswrapper[29252]: I1203 20:14:34.398439 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/922b1203-f140-45bb-94a2-6efb31cf5ee8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.398530 master-0 kubenswrapper[29252]: I1203 20:14:34.398503 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/922b1203-f140-45bb-94a2-6efb31cf5ee8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.398530 master-0 kubenswrapper[29252]: I1203 20:14:34.398528 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/922b1203-f140-45bb-94a2-6efb31cf5ee8-metrics-client-ca\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.398865 master-0 kubenswrapper[29252]: I1203 20:14:34.398554 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/922b1203-f140-45bb-94a2-6efb31cf5ee8-secret-grpc-tls\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.398865 master-0 kubenswrapper[29252]: I1203 20:14:34.398579 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/922b1203-f140-45bb-94a2-6efb31cf5ee8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.398865 master-0 kubenswrapper[29252]: I1203 20:14:34.398609 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6b54\" (UniqueName: \"kubernetes.io/projected/922b1203-f140-45bb-94a2-6efb31cf5ee8-kube-api-access-t6b54\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.398865 master-0 kubenswrapper[29252]: I1203 20:14:34.398650 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/922b1203-f140-45bb-94a2-6efb31cf5ee8-secret-thanos-querier-tls\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.398865 master-0 kubenswrapper[29252]: I1203 20:14:34.398675 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/922b1203-f140-45bb-94a2-6efb31cf5ee8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.400412 master-0 kubenswrapper[29252]: I1203 20:14:34.400346 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/922b1203-f140-45bb-94a2-6efb31cf5ee8-metrics-client-ca\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.402879 master-0 kubenswrapper[29252]: I1203 20:14:34.402832 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/922b1203-f140-45bb-94a2-6efb31cf5ee8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.403003 master-0 kubenswrapper[29252]: I1203 20:14:34.402976 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/922b1203-f140-45bb-94a2-6efb31cf5ee8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.403196 master-0 kubenswrapper[29252]: I1203 20:14:34.403162 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/922b1203-f140-45bb-94a2-6efb31cf5ee8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.404287 master-0 kubenswrapper[29252]: I1203 20:14:34.404260 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/922b1203-f140-45bb-94a2-6efb31cf5ee8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.404618 master-0 kubenswrapper[29252]: I1203 20:14:34.404571 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/922b1203-f140-45bb-94a2-6efb31cf5ee8-secret-thanos-querier-tls\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.413468 master-0 kubenswrapper[29252]: I1203 20:14:34.413435 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/922b1203-f140-45bb-94a2-6efb31cf5ee8-secret-grpc-tls\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.416679 master-0 kubenswrapper[29252]: I1203 20:14:34.416639 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6b54\" (UniqueName: \"kubernetes.io/projected/922b1203-f140-45bb-94a2-6efb31cf5ee8-kube-api-access-t6b54\") pod \"thanos-querier-54c84f8475-9xl5s\" (UID: \"922b1203-f140-45bb-94a2-6efb31cf5ee8\") " pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:34.545861 master-0 kubenswrapper[29252]: I1203 20:14:34.541061 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s"
Dec 03 20:14:35.192132 master-0 kubenswrapper[29252]: I1203 20:14:35.192077 29252 patch_prober.go:28] interesting pod/controller-manager-ff788744d-hkt6c container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.65:8443/healthz\": dial tcp 10.128.0.65:8443: connect: connection refused" start-of-body=
Dec 03 20:14:35.192364 master-0 kubenswrapper[29252]: I1203 20:14:35.192145 29252 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" podUID="1c22cb59-5083-4be6-9998-a9e67a2c20cd" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.65:8443/healthz\": dial tcp 10.128.0.65:8443: connect: connection refused"
Dec 03 20:14:35.390128 master-0 kubenswrapper[29252]: W1203 20:14:35.390069 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89f08828_d22f_48a0_b247_fbe323742568.slice/crio-6c9d812506d3ca1b12698e8efee8179c9eb0080bab6c8fbe738f1032432decd4 WatchSource:0}: Error finding container 6c9d812506d3ca1b12698e8efee8179c9eb0080bab6c8fbe738f1032432decd4: Status 404 returned error can't find the container with id 6c9d812506d3ca1b12698e8efee8179c9eb0080bab6c8fbe738f1032432decd4
Dec 03 20:14:35.755147 master-0 kubenswrapper[29252]: I1203 20:14:35.755106 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"
Dec 03 20:14:35.804690 master-0 kubenswrapper[29252]: I1203 20:14:35.804585 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc"]
Dec 03 20:14:35.805173 master-0 kubenswrapper[29252]: E1203 20:14:35.805132 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c52974d8-fbe6-444b-97ae-468482eebac8" containerName="route-controller-manager"
Dec 03 20:14:35.805173 master-0 kubenswrapper[29252]: I1203 20:14:35.805159 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52974d8-fbe6-444b-97ae-468482eebac8" containerName="route-controller-manager"
Dec 03 20:14:35.805458 master-0 kubenswrapper[29252]: I1203 20:14:35.805418 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="c52974d8-fbe6-444b-97ae-468482eebac8" containerName="route-controller-manager"
Dec 03 20:14:35.806660 master-0 kubenswrapper[29252]: I1203 20:14:35.806602 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc"
Dec 03 20:14:35.809370 master-0 kubenswrapper[29252]: I1203 20:14:35.809304 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-gcvgk"
Dec 03 20:14:35.812473 master-0 kubenswrapper[29252]: I1203 20:14:35.812419 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c"
Dec 03 20:14:35.813347 master-0 kubenswrapper[29252]: I1203 20:14:35.813296 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc"]
Dec 03 20:14:35.834634 master-0 kubenswrapper[29252]: I1203 20:14:35.834580 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c52974d8-fbe6-444b-97ae-468482eebac8-serving-cert\") pod \"c52974d8-fbe6-444b-97ae-468482eebac8\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") "
Dec 03 20:14:35.834967 master-0 kubenswrapper[29252]: I1203 20:14:35.834705 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7vxl\" (UniqueName: \"kubernetes.io/projected/c52974d8-fbe6-444b-97ae-468482eebac8-kube-api-access-p7vxl\") pod \"c52974d8-fbe6-444b-97ae-468482eebac8\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") "
Dec 03 20:14:35.834967 master-0 kubenswrapper[29252]: I1203 20:14:35.834814 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-client-ca\") pod \"c52974d8-fbe6-444b-97ae-468482eebac8\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") "
Dec 03 20:14:35.834967 master-0 kubenswrapper[29252]: I1203 20:14:35.834850 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-config\") pod \"c52974d8-fbe6-444b-97ae-468482eebac8\" (UID: \"c52974d8-fbe6-444b-97ae-468482eebac8\") "
Dec 03 20:14:35.835759 master-0 kubenswrapper[29252]: I1203 20:14:35.835733 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-config" (OuterVolumeSpecName: "config") pod "c52974d8-fbe6-444b-97ae-468482eebac8" (UID: "c52974d8-fbe6-444b-97ae-468482eebac8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 20:14:35.836385 master-0 kubenswrapper[29252]: I1203 20:14:35.836352 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-client-ca" (OuterVolumeSpecName: "client-ca") pod "c52974d8-fbe6-444b-97ae-468482eebac8" (UID: "c52974d8-fbe6-444b-97ae-468482eebac8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 20:14:35.838619 master-0 kubenswrapper[29252]: I1203 20:14:35.838561 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c52974d8-fbe6-444b-97ae-468482eebac8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c52974d8-fbe6-444b-97ae-468482eebac8" (UID: "c52974d8-fbe6-444b-97ae-468482eebac8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 20:14:35.839046 master-0 kubenswrapper[29252]: I1203 20:14:35.838959 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c52974d8-fbe6-444b-97ae-468482eebac8-kube-api-access-p7vxl" (OuterVolumeSpecName: "kube-api-access-p7vxl") pod "c52974d8-fbe6-444b-97ae-468482eebac8" (UID: "c52974d8-fbe6-444b-97ae-468482eebac8"). InnerVolumeSpecName "kube-api-access-p7vxl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 20:14:35.919666 master-0 kubenswrapper[29252]: I1203 20:14:35.919343 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k"]
Dec 03 20:14:35.926092 master-0 kubenswrapper[29252]: I1203 20:14:35.926054 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fbbdc9bc8-g94br"]
Dec 03 20:14:35.933609 master-0 kubenswrapper[29252]: W1203 20:14:35.933562 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a6fb6ec_be7b_4987_a6dd_51ccb45e2b1d.slice/crio-458ccf479cec44e7661623ecdcdf188ad3fcab81c988b82ff57e5946294972a3 WatchSource:0}: Error finding container 458ccf479cec44e7661623ecdcdf188ad3fcab81c988b82ff57e5946294972a3: Status 404 returned error can't find the container with id 458ccf479cec44e7661623ecdcdf188ad3fcab81c988b82ff57e5946294972a3
Dec 03 20:14:35.935446 master-0 kubenswrapper[29252]: I1203 20:14:35.935410 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c22cb59-5083-4be6-9998-a9e67a2c20cd-serving-cert\") pod \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") "
Dec 03 20:14:35.935557 master-0 kubenswrapper[29252]: I1203 20:14:35.935526 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cnmn\" (UniqueName: \"kubernetes.io/projected/1c22cb59-5083-4be6-9998-a9e67a2c20cd-kube-api-access-7cnmn\") pod \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") "
Dec 03 20:14:35.935619 master-0 kubenswrapper[29252]: I1203 20:14:35.935604 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-client-ca\")
pod \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " Dec 03 20:14:35.936236 master-0 kubenswrapper[29252]: I1203 20:14:35.936202 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-client-ca" (OuterVolumeSpecName: "client-ca") pod "1c22cb59-5083-4be6-9998-a9e67a2c20cd" (UID: "1c22cb59-5083-4be6-9998-a9e67a2c20cd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:14:35.936945 master-0 kubenswrapper[29252]: I1203 20:14:35.936911 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-proxy-ca-bundles\") pod \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " Dec 03 20:14:35.936997 master-0 kubenswrapper[29252]: I1203 20:14:35.936961 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-config\") pod \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\" (UID: \"1c22cb59-5083-4be6-9998-a9e67a2c20cd\") " Dec 03 20:14:35.937188 master-0 kubenswrapper[29252]: I1203 20:14:35.937138 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbdb41ba-89c8-430d-a4a8-5821cebbe2e0-config\") pod \"route-controller-manager-f66884f7c-299nc\" (UID: \"bbdb41ba-89c8-430d-a4a8-5821cebbe2e0\") " pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc" Dec 03 20:14:35.937223 master-0 kubenswrapper[29252]: I1203 20:14:35.937198 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbdb41ba-89c8-430d-a4a8-5821cebbe2e0-client-ca\") pod 
\"route-controller-manager-f66884f7c-299nc\" (UID: \"bbdb41ba-89c8-430d-a4a8-5821cebbe2e0\") " pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc" Dec 03 20:14:35.937291 master-0 kubenswrapper[29252]: I1203 20:14:35.937233 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dknkt\" (UniqueName: \"kubernetes.io/projected/bbdb41ba-89c8-430d-a4a8-5821cebbe2e0-kube-api-access-dknkt\") pod \"route-controller-manager-f66884f7c-299nc\" (UID: \"bbdb41ba-89c8-430d-a4a8-5821cebbe2e0\") " pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc" Dec 03 20:14:35.937323 master-0 kubenswrapper[29252]: I1203 20:14:35.937299 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbdb41ba-89c8-430d-a4a8-5821cebbe2e0-serving-cert\") pod \"route-controller-manager-f66884f7c-299nc\" (UID: \"bbdb41ba-89c8-430d-a4a8-5821cebbe2e0\") " pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc" Dec 03 20:14:35.937383 master-0 kubenswrapper[29252]: I1203 20:14:35.937364 29252 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 20:14:35.937417 master-0 kubenswrapper[29252]: I1203 20:14:35.937384 29252 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52974d8-fbe6-444b-97ae-468482eebac8-config\") on node \"master-0\" DevicePath \"\"" Dec 03 20:14:35.937417 master-0 kubenswrapper[29252]: I1203 20:14:35.937396 29252 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c52974d8-fbe6-444b-97ae-468482eebac8-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 20:14:35.937417 master-0 
kubenswrapper[29252]: I1203 20:14:35.937408 29252 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 20:14:35.937506 master-0 kubenswrapper[29252]: I1203 20:14:35.937419 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7vxl\" (UniqueName: \"kubernetes.io/projected/c52974d8-fbe6-444b-97ae-468482eebac8-kube-api-access-p7vxl\") on node \"master-0\" DevicePath \"\"" Dec 03 20:14:35.937873 master-0 kubenswrapper[29252]: I1203 20:14:35.937847 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1c22cb59-5083-4be6-9998-a9e67a2c20cd" (UID: "1c22cb59-5083-4be6-9998-a9e67a2c20cd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:14:35.938406 master-0 kubenswrapper[29252]: I1203 20:14:35.938368 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-config" (OuterVolumeSpecName: "config") pod "1c22cb59-5083-4be6-9998-a9e67a2c20cd" (UID: "1c22cb59-5083-4be6-9998-a9e67a2c20cd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:14:35.941171 master-0 kubenswrapper[29252]: I1203 20:14:35.941126 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c22cb59-5083-4be6-9998-a9e67a2c20cd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1c22cb59-5083-4be6-9998-a9e67a2c20cd" (UID: "1c22cb59-5083-4be6-9998-a9e67a2c20cd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:14:35.941582 master-0 kubenswrapper[29252]: I1203 20:14:35.941539 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 03 20:14:35.942339 master-0 kubenswrapper[29252]: I1203 20:14:35.942206 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c22cb59-5083-4be6-9998-a9e67a2c20cd-kube-api-access-7cnmn" (OuterVolumeSpecName: "kube-api-access-7cnmn") pod "1c22cb59-5083-4be6-9998-a9e67a2c20cd" (UID: "1c22cb59-5083-4be6-9998-a9e67a2c20cd"). InnerVolumeSpecName "kube-api-access-7cnmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:14:36.038359 master-0 kubenswrapper[29252]: I1203 20:14:36.038299 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbdb41ba-89c8-430d-a4a8-5821cebbe2e0-serving-cert\") pod \"route-controller-manager-f66884f7c-299nc\" (UID: \"bbdb41ba-89c8-430d-a4a8-5821cebbe2e0\") " pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc" Dec 03 20:14:36.038554 master-0 kubenswrapper[29252]: I1203 20:14:36.038399 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbdb41ba-89c8-430d-a4a8-5821cebbe2e0-config\") pod \"route-controller-manager-f66884f7c-299nc\" (UID: \"bbdb41ba-89c8-430d-a4a8-5821cebbe2e0\") " pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc" Dec 03 20:14:36.038554 master-0 kubenswrapper[29252]: I1203 20:14:36.038426 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbdb41ba-89c8-430d-a4a8-5821cebbe2e0-client-ca\") pod \"route-controller-manager-f66884f7c-299nc\" (UID: \"bbdb41ba-89c8-430d-a4a8-5821cebbe2e0\") " 
pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc" Dec 03 20:14:36.038554 master-0 kubenswrapper[29252]: I1203 20:14:36.038453 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dknkt\" (UniqueName: \"kubernetes.io/projected/bbdb41ba-89c8-430d-a4a8-5821cebbe2e0-kube-api-access-dknkt\") pod \"route-controller-manager-f66884f7c-299nc\" (UID: \"bbdb41ba-89c8-430d-a4a8-5821cebbe2e0\") " pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc" Dec 03 20:14:36.038554 master-0 kubenswrapper[29252]: I1203 20:14:36.038533 29252 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c22cb59-5083-4be6-9998-a9e67a2c20cd-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 20:14:36.038554 master-0 kubenswrapper[29252]: I1203 20:14:36.038545 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cnmn\" (UniqueName: \"kubernetes.io/projected/1c22cb59-5083-4be6-9998-a9e67a2c20cd-kube-api-access-7cnmn\") on node \"master-0\" DevicePath \"\"" Dec 03 20:14:36.038554 master-0 kubenswrapper[29252]: I1203 20:14:36.038556 29252 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Dec 03 20:14:36.038831 master-0 kubenswrapper[29252]: I1203 20:14:36.038566 29252 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c22cb59-5083-4be6-9998-a9e67a2c20cd-config\") on node \"master-0\" DevicePath \"\"" Dec 03 20:14:36.039575 master-0 kubenswrapper[29252]: I1203 20:14:36.039549 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbdb41ba-89c8-430d-a4a8-5821cebbe2e0-client-ca\") pod \"route-controller-manager-f66884f7c-299nc\" (UID: 
\"bbdb41ba-89c8-430d-a4a8-5821cebbe2e0\") " pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc" Dec 03 20:14:36.040021 master-0 kubenswrapper[29252]: I1203 20:14:36.039993 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbdb41ba-89c8-430d-a4a8-5821cebbe2e0-config\") pod \"route-controller-manager-f66884f7c-299nc\" (UID: \"bbdb41ba-89c8-430d-a4a8-5821cebbe2e0\") " pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc" Dec 03 20:14:36.048945 master-0 kubenswrapper[29252]: I1203 20:14:36.048327 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbdb41ba-89c8-430d-a4a8-5821cebbe2e0-serving-cert\") pod \"route-controller-manager-f66884f7c-299nc\" (UID: \"bbdb41ba-89c8-430d-a4a8-5821cebbe2e0\") " pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc" Dec 03 20:14:36.057191 master-0 kubenswrapper[29252]: I1203 20:14:36.057132 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dknkt\" (UniqueName: \"kubernetes.io/projected/bbdb41ba-89c8-430d-a4a8-5821cebbe2e0-kube-api-access-dknkt\") pod \"route-controller-manager-f66884f7c-299nc\" (UID: \"bbdb41ba-89c8-430d-a4a8-5821cebbe2e0\") " pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc" Dec 03 20:14:36.132864 master-0 kubenswrapper[29252]: I1203 20:14:36.132814 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc" Dec 03 20:14:36.277726 master-0 kubenswrapper[29252]: I1203 20:14:36.277646 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" event={"ID":"70f550ce-35e6-482b-a7ff-4a8c11569406","Type":"ContainerStarted","Data":"7f8c1328d3df3dfbe4161ecc62f06e5fb90ba9e1306abf543bcdbf01a0dea85f"} Dec 03 20:14:36.278763 master-0 kubenswrapper[29252]: I1203 20:14:36.278738 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d","Type":"ContainerStarted","Data":"458ccf479cec44e7661623ecdcdf188ad3fcab81c988b82ff57e5946294972a3"} Dec 03 20:14:36.280835 master-0 kubenswrapper[29252]: I1203 20:14:36.280800 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" Dec 03 20:14:36.281033 master-0 kubenswrapper[29252]: I1203 20:14:36.280767 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-ff788744d-hkt6c" event={"ID":"1c22cb59-5083-4be6-9998-a9e67a2c20cd","Type":"ContainerDied","Data":"bb807fb004e1c5a8c12ce908fa4f2effefa5e62f25142bb2fe3ec8dd74d140f1"} Dec 03 20:14:36.281125 master-0 kubenswrapper[29252]: I1203 20:14:36.281093 29252 scope.go:117] "RemoveContainer" containerID="263c9892d0db3a282cd4fd76feedfc7a2f00079133490560ffda1c5aceb719de" Dec 03 20:14:36.283377 master-0 kubenswrapper[29252]: I1203 20:14:36.283224 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" event={"ID":"8db6fff6-9e07-4c7d-97b7-dea394f706c6","Type":"ContainerStarted","Data":"99029e5e551e0cf40043fe1556eb3f5a583127a54d7df23af10275f14e3ca238"} Dec 03 20:14:36.283718 master-0 kubenswrapper[29252]: I1203 20:14:36.283685 29252 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:36.285453 master-0 kubenswrapper[29252]: I1203 20:14:36.285417 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b6f946576-zgpxr" event={"ID":"33f0cc6e-2015-4c7e-848f-ccca37ad61c4","Type":"ContainerStarted","Data":"568b90ae97cb9fa30fe2248b862ed4b9d85f7d7109a889f6a7199ab4cbf90805"} Dec 03 20:14:36.288320 master-0 kubenswrapper[29252]: I1203 20:14:36.288277 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x2wbs" event={"ID":"89f08828-d22f-48a0-b247-fbe323742568","Type":"ContainerStarted","Data":"6c9d812506d3ca1b12698e8efee8179c9eb0080bab6c8fbe738f1032432decd4"} Dec 03 20:14:36.289996 master-0 kubenswrapper[29252]: I1203 20:14:36.289973 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj" event={"ID":"c52974d8-fbe6-444b-97ae-468482eebac8","Type":"ContainerDied","Data":"a57372b0142961fc3eb84ca639278793b8a44eddc61b15169b7d9172b7c9d91a"} Dec 03 20:14:36.290113 master-0 kubenswrapper[29252]: I1203 20:14:36.290032 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj" Dec 03 20:14:36.293731 master-0 kubenswrapper[29252]: I1203 20:14:36.293408 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fbbdc9bc8-g94br" event={"ID":"d3f987dc-c7cb-4818-a321-6b92375224a0","Type":"ContainerStarted","Data":"69f3052cc78de7d39af42d8819f14ebf2072ed3a299de11beb056fbaaefe82f0"} Dec 03 20:14:36.299453 master-0 kubenswrapper[29252]: I1203 20:14:36.299409 29252 scope.go:117] "RemoveContainer" containerID="fb26888be03097a323ec8a570f669f63df9f46f37a911bd2cc4812c68e4c8b64" Dec 03 20:14:37.284611 master-0 kubenswrapper[29252]: I1203 20:14:37.284490 29252 patch_prober.go:28] interesting pod/oauth-openshift-65d8f97447-xswx9 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.90:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 20:14:37.284611 master-0 kubenswrapper[29252]: I1203 20:14:37.284550 29252 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" podUID="8db6fff6-9e07-4c7d-97b7-dea394f706c6" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.90:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 20:14:37.301988 master-0 kubenswrapper[29252]: I1203 20:14:37.301920 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" event={"ID":"70f550ce-35e6-482b-a7ff-4a8c11569406","Type":"ContainerStarted","Data":"07ed5ff27ce55362de868d0925c7101538a87d3eb31ea9ff5afbf46f4ec84825"} Dec 03 20:14:37.305319 master-0 kubenswrapper[29252]: I1203 20:14:37.305252 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-6fbbdc9bc8-g94br" event={"ID":"d3f987dc-c7cb-4818-a321-6b92375224a0","Type":"ContainerStarted","Data":"3f031dce6a1dc6f83e22c81233f26ae445eb5ba510b6d092c4d465872022408d"} Dec 03 20:14:37.642927 master-0 kubenswrapper[29252]: I1203 20:14:37.638455 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-54c84f8475-9xl5s"] Dec 03 20:14:37.651325 master-0 kubenswrapper[29252]: W1203 20:14:37.651263 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod922b1203_f140_45bb_94a2_6efb31cf5ee8.slice/crio-4678fc686562b5b7415cd71a6c4953721bbb436d1673ef727e8aa43e685ce24e WatchSource:0}: Error finding container 4678fc686562b5b7415cd71a6c4953721bbb436d1673ef727e8aa43e685ce24e: Status 404 returned error can't find the container with id 4678fc686562b5b7415cd71a6c4953721bbb436d1673ef727e8aa43e685ce24e Dec 03 20:14:38.314847 master-0 kubenswrapper[29252]: I1203 20:14:38.314511 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s" event={"ID":"922b1203-f140-45bb-94a2-6efb31cf5ee8","Type":"ContainerStarted","Data":"4678fc686562b5b7415cd71a6c4953721bbb436d1673ef727e8aa43e685ce24e"} Dec 03 20:14:39.201355 master-0 kubenswrapper[29252]: I1203 20:14:39.187669 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6458bb746f-d7t8z"] Dec 03 20:14:39.201355 master-0 kubenswrapper[29252]: E1203 20:14:39.190003 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c22cb59-5083-4be6-9998-a9e67a2c20cd" containerName="controller-manager" Dec 03 20:14:39.201355 master-0 kubenswrapper[29252]: I1203 20:14:39.190032 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c22cb59-5083-4be6-9998-a9e67a2c20cd" containerName="controller-manager" Dec 03 20:14:39.201355 master-0 kubenswrapper[29252]: I1203 20:14:39.190234 
29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c22cb59-5083-4be6-9998-a9e67a2c20cd" containerName="controller-manager" Dec 03 20:14:39.201355 master-0 kubenswrapper[29252]: I1203 20:14:39.190866 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:39.201355 master-0 kubenswrapper[29252]: I1203 20:14:39.195229 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-ln9n2" Dec 03 20:14:39.201355 master-0 kubenswrapper[29252]: I1203 20:14:39.195548 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 20:14:39.201355 master-0 kubenswrapper[29252]: I1203 20:14:39.195704 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 20:14:39.201355 master-0 kubenswrapper[29252]: I1203 20:14:39.195870 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 20:14:39.214801 master-0 kubenswrapper[29252]: I1203 20:14:39.213271 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 20:14:39.214801 master-0 kubenswrapper[29252]: I1203 20:14:39.213403 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 20:14:39.245032 master-0 kubenswrapper[29252]: I1203 20:14:39.239602 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 20:14:39.270811 master-0 kubenswrapper[29252]: I1203 20:14:39.267295 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6458bb746f-d7t8z"] Dec 03 20:14:39.290733 master-0 
kubenswrapper[29252]: I1203 20:14:39.278894 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr"] Dec 03 20:14:39.290733 master-0 kubenswrapper[29252]: I1203 20:14:39.288523 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2418acf-7b45-40cd-8593-0f9171f36c4e-client-ca\") pod \"controller-manager-6458bb746f-d7t8z\" (UID: \"f2418acf-7b45-40cd-8593-0f9171f36c4e\") " pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:39.290733 master-0 kubenswrapper[29252]: I1203 20:14:39.288590 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2418acf-7b45-40cd-8593-0f9171f36c4e-config\") pod \"controller-manager-6458bb746f-d7t8z\" (UID: \"f2418acf-7b45-40cd-8593-0f9171f36c4e\") " pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:39.290733 master-0 kubenswrapper[29252]: I1203 20:14:39.288630 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f2418acf-7b45-40cd-8593-0f9171f36c4e-proxy-ca-bundles\") pod \"controller-manager-6458bb746f-d7t8z\" (UID: \"f2418acf-7b45-40cd-8593-0f9171f36c4e\") " pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:39.290733 master-0 kubenswrapper[29252]: I1203 20:14:39.288707 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2418acf-7b45-40cd-8593-0f9171f36c4e-serving-cert\") pod \"controller-manager-6458bb746f-d7t8z\" (UID: \"f2418acf-7b45-40cd-8593-0f9171f36c4e\") " pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:39.290733 master-0 
kubenswrapper[29252]: I1203 20:14:39.288735 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg7bc\" (UniqueName: \"kubernetes.io/projected/f2418acf-7b45-40cd-8593-0f9171f36c4e-kube-api-access-mg7bc\") pod \"controller-manager-6458bb746f-d7t8z\" (UID: \"f2418acf-7b45-40cd-8593-0f9171f36c4e\") " pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:39.300306 master-0 kubenswrapper[29252]: I1203 20:14:39.296917 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc"] Dec 03 20:14:39.332971 master-0 kubenswrapper[29252]: I1203 20:14:39.330029 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b6f946576-zgpxr" podStartSLOduration=5.081770315 podStartE2EDuration="10.330007466s" podCreationTimestamp="2025-12-03 20:14:29 +0000 UTC" firstStartedPulling="2025-12-03 20:14:30.121806197 +0000 UTC m=+304.935351150" lastFinishedPulling="2025-12-03 20:14:35.370043338 +0000 UTC m=+310.183588301" observedRunningTime="2025-12-03 20:14:39.270080484 +0000 UTC m=+314.083625467" watchObservedRunningTime="2025-12-03 20:14:39.330007466 +0000 UTC m=+314.143552429" Dec 03 20:14:39.346163 master-0 kubenswrapper[29252]: W1203 20:14:39.344888 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbdb41ba_89c8_430d_a4a8_5821cebbe2e0.slice/crio-4e098e917cb4e10a0bb984f5627b500895648a9a34b72ca60544bed131ae984c WatchSource:0}: Error finding container 4e098e917cb4e10a0bb984f5627b500895648a9a34b72ca60544bed131ae984c: Status 404 returned error can't find the container with id 4e098e917cb4e10a0bb984f5627b500895648a9a34b72ca60544bed131ae984c Dec 03 20:14:39.366936 master-0 kubenswrapper[29252]: I1203 20:14:39.365876 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" podStartSLOduration=4.865426289 podStartE2EDuration="10.365850526s" podCreationTimestamp="2025-12-03 20:14:29 +0000 UTC" firstStartedPulling="2025-12-03 20:14:29.84854762 +0000 UTC m=+304.662092573" lastFinishedPulling="2025-12-03 20:14:35.348971857 +0000 UTC m=+310.162516810" observedRunningTime="2025-12-03 20:14:39.328950121 +0000 UTC m=+314.142495094" watchObservedRunningTime="2025-12-03 20:14:39.365850526 +0000 UTC m=+314.179395479" Dec 03 20:14:39.385269 master-0 kubenswrapper[29252]: I1203 20:14:39.385216 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" event={"ID":"28e4a5ec-9304-475d-8321-13b21985d688","Type":"ContainerStarted","Data":"693df55c64980dfbf7720aed59dceab865996eb59b8a10f76faae37cb306b125"} Dec 03 20:14:39.390048 master-0 kubenswrapper[29252]: I1203 20:14:39.389898 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2418acf-7b45-40cd-8593-0f9171f36c4e-serving-cert\") pod \"controller-manager-6458bb746f-d7t8z\" (UID: \"f2418acf-7b45-40cd-8593-0f9171f36c4e\") " pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:39.390048 master-0 kubenswrapper[29252]: I1203 20:14:39.389947 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg7bc\" (UniqueName: \"kubernetes.io/projected/f2418acf-7b45-40cd-8593-0f9171f36c4e-kube-api-access-mg7bc\") pod \"controller-manager-6458bb746f-d7t8z\" (UID: \"f2418acf-7b45-40cd-8593-0f9171f36c4e\") " pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:39.390048 master-0 kubenswrapper[29252]: I1203 20:14:39.390025 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2418acf-7b45-40cd-8593-0f9171f36c4e-client-ca\") pod 
\"controller-manager-6458bb746f-d7t8z\" (UID: \"f2418acf-7b45-40cd-8593-0f9171f36c4e\") " pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:39.390048 master-0 kubenswrapper[29252]: I1203 20:14:39.390048 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2418acf-7b45-40cd-8593-0f9171f36c4e-config\") pod \"controller-manager-6458bb746f-d7t8z\" (UID: \"f2418acf-7b45-40cd-8593-0f9171f36c4e\") " pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:39.390386 master-0 kubenswrapper[29252]: I1203 20:14:39.390071 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f2418acf-7b45-40cd-8593-0f9171f36c4e-proxy-ca-bundles\") pod \"controller-manager-6458bb746f-d7t8z\" (UID: \"f2418acf-7b45-40cd-8593-0f9171f36c4e\") " pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:39.391925 master-0 kubenswrapper[29252]: I1203 20:14:39.391897 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f2418acf-7b45-40cd-8593-0f9171f36c4e-proxy-ca-bundles\") pod \"controller-manager-6458bb746f-d7t8z\" (UID: \"f2418acf-7b45-40cd-8593-0f9171f36c4e\") " pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:39.399923 master-0 kubenswrapper[29252]: I1203 20:14:39.395549 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2418acf-7b45-40cd-8593-0f9171f36c4e-config\") pod \"controller-manager-6458bb746f-d7t8z\" (UID: \"f2418acf-7b45-40cd-8593-0f9171f36c4e\") " pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:39.405872 master-0 kubenswrapper[29252]: I1203 20:14:39.405306 29252 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2418acf-7b45-40cd-8593-0f9171f36c4e-client-ca\") pod \"controller-manager-6458bb746f-d7t8z\" (UID: \"f2418acf-7b45-40cd-8593-0f9171f36c4e\") " pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:39.437797 master-0 kubenswrapper[29252]: I1203 20:14:39.425550 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg7bc\" (UniqueName: \"kubernetes.io/projected/f2418acf-7b45-40cd-8593-0f9171f36c4e-kube-api-access-mg7bc\") pod \"controller-manager-6458bb746f-d7t8z\" (UID: \"f2418acf-7b45-40cd-8593-0f9171f36c4e\") " pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:39.437797 master-0 kubenswrapper[29252]: I1203 20:14:39.426411 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2418acf-7b45-40cd-8593-0f9171f36c4e-serving-cert\") pod \"controller-manager-6458bb746f-d7t8z\" (UID: \"f2418acf-7b45-40cd-8593-0f9171f36c4e\") " pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:39.471823 master-0 kubenswrapper[29252]: I1203 20:14:39.450791 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fbbdc9bc8-g94br" podStartSLOduration=8.450755285 podStartE2EDuration="8.450755285s" podCreationTimestamp="2025-12-03 20:14:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:14:39.4105684 +0000 UTC m=+314.224113373" watchObservedRunningTime="2025-12-03 20:14:39.450755285 +0000 UTC m=+314.264300238" Dec 03 20:14:39.478797 master-0 kubenswrapper[29252]: I1203 20:14:39.473141 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:14:39.478797 master-0 
kubenswrapper[29252]: I1203 20:14:39.473183 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" event={"ID":"70f550ce-35e6-482b-a7ff-4a8c11569406","Type":"ContainerStarted","Data":"23822e33db6c6d056709a161eb639efbbfd374e096c2249e7fb99ce6f73c1eb1"} Dec 03 20:14:39.478797 master-0 kubenswrapper[29252]: I1203 20:14:39.473205 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"] Dec 03 20:14:39.478797 master-0 kubenswrapper[29252]: I1203 20:14:39.473228 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86dd7cbd76-jg7rj"] Dec 03 20:14:39.523097 master-0 kubenswrapper[29252]: I1203 20:14:39.512017 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-ff788744d-hkt6c"] Dec 03 20:14:39.560347 master-0 kubenswrapper[29252]: I1203 20:14:39.543437 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h"] Dec 03 20:14:39.563757 master-0 kubenswrapper[29252]: I1203 20:14:39.563714 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-ff788744d-hkt6c"] Dec 03 20:14:39.564028 master-0 kubenswrapper[29252]: I1203 20:14:39.563989 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.565993 master-0 kubenswrapper[29252]: I1203 20:14:39.565958 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:39.570896 master-0 kubenswrapper[29252]: I1203 20:14:39.570860 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Dec 03 20:14:39.571276 master-0 kubenswrapper[29252]: I1203 20:14:39.571062 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Dec 03 20:14:39.572387 master-0 kubenswrapper[29252]: I1203 20:14:39.572361 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Dec 03 20:14:39.575186 master-0 kubenswrapper[29252]: I1203 20:14:39.575093 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-mdqv5" Dec 03 20:14:39.575327 master-0 kubenswrapper[29252]: I1203 20:14:39.575309 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Dec 03 20:14:39.575621 master-0 kubenswrapper[29252]: I1203 20:14:39.575594 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b6f946576-zgpxr"] Dec 03 20:14:39.576337 master-0 kubenswrapper[29252]: I1203 20:14:39.576316 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-56c9b9fa8d9gs" Dec 03 20:14:39.581613 master-0 kubenswrapper[29252]: I1203 20:14:39.581340 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Dec 03 20:14:39.603298 master-0 kubenswrapper[29252]: I1203 20:14:39.603263 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h"] Dec 03 20:14:39.622499 master-0 kubenswrapper[29252]: I1203 20:14:39.622436 29252 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-console/console-5b5d787587-g9t7c"] Dec 03 20:14:39.623567 master-0 kubenswrapper[29252]: I1203 20:14:39.623545 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.643712 master-0 kubenswrapper[29252]: I1203 20:14:39.638946 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b5d787587-g9t7c"] Dec 03 20:14:39.646862 master-0 kubenswrapper[29252]: I1203 20:14:39.646826 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:14:39.699890 master-0 kubenswrapper[29252]: I1203 20:14:39.698643 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-service-ca\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.699890 master-0 kubenswrapper[29252]: I1203 20:14:39.698723 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.699890 master-0 kubenswrapper[29252]: I1203 20:14:39.698753 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-serving-certs-ca-bundle\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 
20:14:39.699890 master-0 kubenswrapper[29252]: I1203 20:14:39.698770 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzzr7\" (UniqueName: \"kubernetes.io/projected/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-kube-api-access-qzzr7\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.699890 master-0 kubenswrapper[29252]: I1203 20:14:39.698803 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-console-config\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.699890 master-0 kubenswrapper[29252]: I1203 20:14:39.698827 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-federate-client-tls\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.699890 master-0 kubenswrapper[29252]: I1203 20:14:39.698854 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6426848a-3e1d-4988-9749-5e7fc2620e51-console-serving-cert\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.699890 master-0 kubenswrapper[29252]: I1203 20:14:39.698877 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.699890 master-0 kubenswrapper[29252]: I1203 20:14:39.698900 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-telemeter-client-tls\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.699890 master-0 kubenswrapper[29252]: I1203 20:14:39.698918 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-trusted-ca-bundle\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.699890 master-0 kubenswrapper[29252]: I1203 20:14:39.698935 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-metrics-client-ca\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.699890 master-0 kubenswrapper[29252]: I1203 20:14:39.698955 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6426848a-3e1d-4988-9749-5e7fc2620e51-console-oauth-config\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " 
pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.699890 master-0 kubenswrapper[29252]: I1203 20:14:39.698977 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-secret-telemeter-client\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.699890 master-0 kubenswrapper[29252]: I1203 20:14:39.698998 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljpkb\" (UniqueName: \"kubernetes.io/projected/6426848a-3e1d-4988-9749-5e7fc2620e51-kube-api-access-ljpkb\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.699890 master-0 kubenswrapper[29252]: I1203 20:14:39.699027 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-oauth-serving-cert\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.799804 master-0 kubenswrapper[29252]: I1203 20:14:39.799751 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-service-ca\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.799976 master-0 kubenswrapper[29252]: I1203 20:14:39.799815 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.799976 master-0 kubenswrapper[29252]: I1203 20:14:39.799846 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-serving-certs-ca-bundle\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.799976 master-0 kubenswrapper[29252]: I1203 20:14:39.799867 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzzr7\" (UniqueName: \"kubernetes.io/projected/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-kube-api-access-qzzr7\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.799976 master-0 kubenswrapper[29252]: I1203 20:14:39.799887 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-console-config\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.799976 master-0 kubenswrapper[29252]: I1203 20:14:39.799904 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-federate-client-tls\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.799976 master-0 
kubenswrapper[29252]: I1203 20:14:39.799929 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6426848a-3e1d-4988-9749-5e7fc2620e51-console-serving-cert\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.799976 master-0 kubenswrapper[29252]: I1203 20:14:39.799950 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.799976 master-0 kubenswrapper[29252]: I1203 20:14:39.799970 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-telemeter-client-tls\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.800201 master-0 kubenswrapper[29252]: I1203 20:14:39.799990 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-metrics-client-ca\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.800201 master-0 kubenswrapper[29252]: I1203 20:14:39.800003 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-trusted-ca-bundle\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.800201 master-0 kubenswrapper[29252]: I1203 20:14:39.800039 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6426848a-3e1d-4988-9749-5e7fc2620e51-console-oauth-config\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.800201 master-0 kubenswrapper[29252]: I1203 20:14:39.800060 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-secret-telemeter-client\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.800201 master-0 kubenswrapper[29252]: I1203 20:14:39.800078 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljpkb\" (UniqueName: \"kubernetes.io/projected/6426848a-3e1d-4988-9749-5e7fc2620e51-kube-api-access-ljpkb\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.800201 master-0 kubenswrapper[29252]: I1203 20:14:39.800114 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-oauth-serving-cert\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.800661 master-0 kubenswrapper[29252]: I1203 20:14:39.800632 29252 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-service-ca\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.800713 master-0 kubenswrapper[29252]: I1203 20:14:39.800692 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-oauth-serving-cert\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.801199 master-0 kubenswrapper[29252]: I1203 20:14:39.801166 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.802393 master-0 kubenswrapper[29252]: I1203 20:14:39.801968 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-console-config\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.803461 master-0 kubenswrapper[29252]: I1203 20:14:39.802484 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-serving-certs-ca-bundle\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 
20:14:39.803461 master-0 kubenswrapper[29252]: I1203 20:14:39.802685 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-metrics-client-ca\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.803461 master-0 kubenswrapper[29252]: I1203 20:14:39.803217 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-trusted-ca-bundle\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.804050 master-0 kubenswrapper[29252]: I1203 20:14:39.803995 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6426848a-3e1d-4988-9749-5e7fc2620e51-console-oauth-config\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.804406 master-0 kubenswrapper[29252]: I1203 20:14:39.804369 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.805818 master-0 kubenswrapper[29252]: I1203 20:14:39.805763 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-telemeter-client-tls\") pod 
\"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.805900 master-0 kubenswrapper[29252]: I1203 20:14:39.805862 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6426848a-3e1d-4988-9749-5e7fc2620e51-console-serving-cert\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.806670 master-0 kubenswrapper[29252]: I1203 20:14:39.806652 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-secret-telemeter-client\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.808116 master-0 kubenswrapper[29252]: I1203 20:14:39.808090 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-federate-client-tls\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.846771 master-0 kubenswrapper[29252]: I1203 20:14:39.846728 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljpkb\" (UniqueName: \"kubernetes.io/projected/6426848a-3e1d-4988-9749-5e7fc2620e51-kube-api-access-ljpkb\") pod \"console-5b5d787587-g9t7c\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") " pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:39.851433 master-0 kubenswrapper[29252]: I1203 20:14:39.851401 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzzr7\" 
(UniqueName: \"kubernetes.io/projected/00d60e6a-6ad3-4109-bb1e-30e656b91dc9-kube-api-access-qzzr7\") pod \"telemeter-client-5c6d5cb75d-gcw4h\" (UID: \"00d60e6a-6ad3-4109-bb1e-30e656b91dc9\") " pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.915178 master-0 kubenswrapper[29252]: I1203 20:14:39.914743 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" Dec 03 20:14:39.959206 master-0 kubenswrapper[29252]: I1203 20:14:39.959104 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:40.186296 master-0 kubenswrapper[29252]: I1203 20:14:40.185435 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6458bb746f-d7t8z"] Dec 03 20:14:40.186616 master-0 kubenswrapper[29252]: W1203 20:14:40.186569 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2418acf_7b45_40cd_8593_0f9171f36c4e.slice/crio-fcaf47a0f095b5fd3bf8aad6fae63ca149862127a258ff7c73527597a2899609 WatchSource:0}: Error finding container fcaf47a0f095b5fd3bf8aad6fae63ca149862127a258ff7c73527597a2899609: Status 404 returned error can't find the container with id fcaf47a0f095b5fd3bf8aad6fae63ca149862127a258ff7c73527597a2899609 Dec 03 20:14:40.307517 master-0 kubenswrapper[29252]: I1203 20:14:40.304497 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6b8786b56c-g7dqt"] Dec 03 20:14:40.307517 master-0 kubenswrapper[29252]: I1203 20:14:40.305688 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.308046 master-0 kubenswrapper[29252]: I1203 20:14:40.308020 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-xwq9l" Dec 03 20:14:40.308439 master-0 kubenswrapper[29252]: I1203 20:14:40.308421 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-fekh162m2nm7j" Dec 03 20:14:40.308563 master-0 kubenswrapper[29252]: I1203 20:14:40.308528 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 03 20:14:40.308699 master-0 kubenswrapper[29252]: I1203 20:14:40.308468 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 03 20:14:40.308699 master-0 kubenswrapper[29252]: I1203 20:14:40.308680 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Dec 03 20:14:40.312186 master-0 kubenswrapper[29252]: I1203 20:14:40.308714 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Dec 03 20:14:40.326444 master-0 kubenswrapper[29252]: I1203 20:14:40.326224 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6b8786b56c-g7dqt"] Dec 03 20:14:40.407735 master-0 kubenswrapper[29252]: I1203 20:14:40.407654 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.407735 master-0 kubenswrapper[29252]: I1203 
20:14:40.407706 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-audit-log\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.407735 master-0 kubenswrapper[29252]: I1203 20:14:40.407725 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-metrics-server-audit-profiles\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.407735 master-0 kubenswrapper[29252]: I1203 20:14:40.407751 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-secret-metrics-server-tls\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.408394 master-0 kubenswrapper[29252]: I1203 20:14:40.407854 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-client-ca-bundle\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.408394 master-0 kubenswrapper[29252]: I1203 20:14:40.407920 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swrgf\" (UniqueName: 
\"kubernetes.io/projected/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-kube-api-access-swrgf\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.408394 master-0 kubenswrapper[29252]: I1203 20:14:40.407945 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-secret-metrics-client-certs\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.435206 master-0 kubenswrapper[29252]: I1203 20:14:40.433968 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-78df8f7475-2lnwf"] Dec 03 20:14:40.435496 master-0 kubenswrapper[29252]: I1203 20:14:40.435292 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-78df8f7475-2lnwf" Dec 03 20:14:40.438882 master-0 kubenswrapper[29252]: I1203 20:14:40.437583 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-x7648" Dec 03 20:14:40.440231 master-0 kubenswrapper[29252]: I1203 20:14:40.439240 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Dec 03 20:14:40.440973 master-0 kubenswrapper[29252]: I1203 20:14:40.440944 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-78df8f7475-2lnwf"] Dec 03 20:14:40.465909 master-0 kubenswrapper[29252]: I1203 20:14:40.464398 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" event={"ID":"f2418acf-7b45-40cd-8593-0f9171f36c4e","Type":"ContainerStarted","Data":"e5764cc9f73631bc3a331134e376d4f47666afc80876fec4ff014de80ca3a1f0"} Dec 03 20:14:40.465909 master-0 kubenswrapper[29252]: I1203 20:14:40.464453 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" event={"ID":"f2418acf-7b45-40cd-8593-0f9171f36c4e","Type":"ContainerStarted","Data":"fcaf47a0f095b5fd3bf8aad6fae63ca149862127a258ff7c73527597a2899609"} Dec 03 20:14:40.465909 master-0 kubenswrapper[29252]: I1203 20:14:40.465431 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:40.469798 master-0 kubenswrapper[29252]: I1203 20:14:40.469586 29252 generic.go:334] "Generic (PLEG): container finished" podID="89f08828-d22f-48a0-b247-fbe323742568" containerID="96f04f637ab2be18cc89919d6b3cd46885861ebb1f96d33df9005758850ed357" exitCode=0 Dec 03 20:14:40.469798 master-0 kubenswrapper[29252]: I1203 20:14:40.469663 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-x2wbs" event={"ID":"89f08828-d22f-48a0-b247-fbe323742568","Type":"ContainerDied","Data":"96f04f637ab2be18cc89919d6b3cd46885861ebb1f96d33df9005758850ed357"} Dec 03 20:14:40.473712 master-0 kubenswrapper[29252]: I1203 20:14:40.472295 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" Dec 03 20:14:40.473712 master-0 kubenswrapper[29252]: I1203 20:14:40.472349 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc" event={"ID":"bbdb41ba-89c8-430d-a4a8-5821cebbe2e0","Type":"ContainerStarted","Data":"3d49e264f0ee3ffe5ec99841afdf88a765ae9e8c1de0bef1a8f82d7b01802d13"} Dec 03 20:14:40.473712 master-0 kubenswrapper[29252]: I1203 20:14:40.472365 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc" event={"ID":"bbdb41ba-89c8-430d-a4a8-5821cebbe2e0","Type":"ContainerStarted","Data":"4e098e917cb4e10a0bb984f5627b500895648a9a34b72ca60544bed131ae984c"} Dec 03 20:14:40.473712 master-0 kubenswrapper[29252]: I1203 20:14:40.472603 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc" Dec 03 20:14:40.478626 master-0 kubenswrapper[29252]: I1203 20:14:40.477620 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc" Dec 03 20:14:40.494533 master-0 kubenswrapper[29252]: I1203 20:14:40.494412 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6458bb746f-d7t8z" podStartSLOduration=7.494389837 podStartE2EDuration="7.494389837s" podCreationTimestamp="2025-12-03 20:14:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:14:40.487358766 +0000 UTC m=+315.300903739" watchObservedRunningTime="2025-12-03 20:14:40.494389837 +0000 UTC m=+315.307934800" Dec 03 20:14:40.511399 master-0 kubenswrapper[29252]: I1203 20:14:40.509220 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swrgf\" (UniqueName: \"kubernetes.io/projected/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-kube-api-access-swrgf\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.511399 master-0 kubenswrapper[29252]: I1203 20:14:40.509285 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-secret-metrics-client-certs\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.511399 master-0 kubenswrapper[29252]: I1203 20:14:40.509415 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5a1955e8-1b5a-40b5-b251-d22d715f0e0b-monitoring-plugin-cert\") pod \"monitoring-plugin-78df8f7475-2lnwf\" (UID: \"5a1955e8-1b5a-40b5-b251-d22d715f0e0b\") " pod="openshift-monitoring/monitoring-plugin-78df8f7475-2lnwf" Dec 03 20:14:40.511399 master-0 kubenswrapper[29252]: I1203 20:14:40.509454 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " 
pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.511399 master-0 kubenswrapper[29252]: I1203 20:14:40.509477 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-audit-log\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.511399 master-0 kubenswrapper[29252]: I1203 20:14:40.509499 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-metrics-server-audit-profiles\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.511399 master-0 kubenswrapper[29252]: I1203 20:14:40.509549 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-secret-metrics-server-tls\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.511399 master-0 kubenswrapper[29252]: I1203 20:14:40.509678 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-client-ca-bundle\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.514761 master-0 kubenswrapper[29252]: I1203 20:14:40.514719 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h"] Dec 03 20:14:40.515792 
master-0 kubenswrapper[29252]: I1203 20:14:40.515743 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-secret-metrics-client-certs\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.518411 master-0 kubenswrapper[29252]: I1203 20:14:40.516794 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-audit-log\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.518411 master-0 kubenswrapper[29252]: I1203 20:14:40.517324 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-metrics-server-audit-profiles\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.518411 master-0 kubenswrapper[29252]: I1203 20:14:40.517917 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.522397 master-0 kubenswrapper[29252]: I1203 20:14:40.521817 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-secret-metrics-server-tls\") pod 
\"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.522959 master-0 kubenswrapper[29252]: I1203 20:14:40.522818 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-client-ca-bundle\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.555855 master-0 kubenswrapper[29252]: I1203 20:14:40.552051 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swrgf\" (UniqueName: \"kubernetes.io/projected/9c6bc36a-ed58-4b4d-b602-14ff2d86e266-kube-api-access-swrgf\") pod \"metrics-server-6b8786b56c-g7dqt\" (UID: \"9c6bc36a-ed58-4b4d-b602-14ff2d86e266\") " pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.600962 master-0 kubenswrapper[29252]: I1203 20:14:40.600905 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b5d787587-g9t7c"] Dec 03 20:14:40.605286 master-0 kubenswrapper[29252]: I1203 20:14:40.605218 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f66884f7c-299nc" podStartSLOduration=6.605180194 podStartE2EDuration="6.605180194s" podCreationTimestamp="2025-12-03 20:14:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:14:40.589547775 +0000 UTC m=+315.403092738" watchObservedRunningTime="2025-12-03 20:14:40.605180194 +0000 UTC m=+315.418725147" Dec 03 20:14:40.614842 master-0 kubenswrapper[29252]: I1203 20:14:40.614704 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5a1955e8-1b5a-40b5-b251-d22d715f0e0b-monitoring-plugin-cert\") pod \"monitoring-plugin-78df8f7475-2lnwf\" (UID: \"5a1955e8-1b5a-40b5-b251-d22d715f0e0b\") " pod="openshift-monitoring/monitoring-plugin-78df8f7475-2lnwf" Dec 03 20:14:40.624443 master-0 kubenswrapper[29252]: I1203 20:14:40.624387 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5a1955e8-1b5a-40b5-b251-d22d715f0e0b-monitoring-plugin-cert\") pod \"monitoring-plugin-78df8f7475-2lnwf\" (UID: \"5a1955e8-1b5a-40b5-b251-d22d715f0e0b\") " pod="openshift-monitoring/monitoring-plugin-78df8f7475-2lnwf" Dec 03 20:14:40.668553 master-0 kubenswrapper[29252]: I1203 20:14:40.665999 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:14:40.745673 master-0 kubenswrapper[29252]: I1203 20:14:40.745622 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 03 20:14:40.752529 master-0 kubenswrapper[29252]: I1203 20:14:40.748138 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.752529 master-0 kubenswrapper[29252]: I1203 20:14:40.751140 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-4vlhrftdrf07t" Dec 03 20:14:40.752529 master-0 kubenswrapper[29252]: I1203 20:14:40.751580 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Dec 03 20:14:40.752529 master-0 kubenswrapper[29252]: I1203 20:14:40.751981 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Dec 03 20:14:40.752529 master-0 kubenswrapper[29252]: I1203 20:14:40.752084 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Dec 03 20:14:40.752529 master-0 kubenswrapper[29252]: I1203 20:14:40.752203 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-s7kpg" Dec 03 20:14:40.752529 master-0 kubenswrapper[29252]: I1203 20:14:40.752096 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Dec 03 20:14:40.755948 master-0 kubenswrapper[29252]: I1203 20:14:40.753759 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Dec 03 20:14:40.755948 master-0 kubenswrapper[29252]: I1203 20:14:40.753857 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Dec 03 20:14:40.755948 master-0 kubenswrapper[29252]: I1203 20:14:40.753960 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Dec 03 20:14:40.755948 master-0 kubenswrapper[29252]: I1203 20:14:40.754076 29252 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-monitoring"/"serving-certs-ca-bundle" Dec 03 20:14:40.755948 master-0 kubenswrapper[29252]: I1203 20:14:40.755692 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Dec 03 20:14:40.758156 master-0 kubenswrapper[29252]: I1203 20:14:40.758121 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-78df8f7475-2lnwf" Dec 03 20:14:40.766405 master-0 kubenswrapper[29252]: I1203 20:14:40.766322 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Dec 03 20:14:40.773527 master-0 kubenswrapper[29252]: I1203 20:14:40.773414 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 03 20:14:40.785317 master-0 kubenswrapper[29252]: I1203 20:14:40.785268 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Dec 03 20:14:40.819466 master-0 kubenswrapper[29252]: I1203 20:14:40.818878 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3560529-2f6a-4193-b606-18474b120488-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.819466 master-0 kubenswrapper[29252]: I1203 20:14:40.818963 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.819466 master-0 kubenswrapper[29252]: I1203 20:14:40.818994 29252 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.819466 master-0 kubenswrapper[29252]: I1203 20:14:40.819043 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-config\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.819466 master-0 kubenswrapper[29252]: I1203 20:14:40.819070 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3560529-2f6a-4193-b606-18474b120488-config-out\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.819466 master-0 kubenswrapper[29252]: I1203 20:14:40.819096 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.819466 master-0 kubenswrapper[29252]: I1203 20:14:40.819145 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " 
pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.819466 master-0 kubenswrapper[29252]: I1203 20:14:40.819174 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79xnq\" (UniqueName: \"kubernetes.io/projected/f3560529-2f6a-4193-b606-18474b120488-kube-api-access-79xnq\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.819466 master-0 kubenswrapper[29252]: I1203 20:14:40.819202 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f3560529-2f6a-4193-b606-18474b120488-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.819466 master-0 kubenswrapper[29252]: I1203 20:14:40.819238 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.819466 master-0 kubenswrapper[29252]: I1203 20:14:40.819264 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-web-config\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.819466 master-0 kubenswrapper[29252]: I1203 20:14:40.819301 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.819466 master-0 kubenswrapper[29252]: I1203 20:14:40.819323 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.819466 master-0 kubenswrapper[29252]: I1203 20:14:40.819346 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.819466 master-0 kubenswrapper[29252]: I1203 20:14:40.819373 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.819466 master-0 kubenswrapper[29252]: I1203 20:14:40.819406 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.819466 master-0 
kubenswrapper[29252]: I1203 20:14:40.819440 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.819466 master-0 kubenswrapper[29252]: I1203 20:14:40.819464 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.922060 master-0 kubenswrapper[29252]: I1203 20:14:40.921946 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79xnq\" (UniqueName: \"kubernetes.io/projected/f3560529-2f6a-4193-b606-18474b120488-kube-api-access-79xnq\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.922060 master-0 kubenswrapper[29252]: I1203 20:14:40.922051 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f3560529-2f6a-4193-b606-18474b120488-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.922344 master-0 kubenswrapper[29252]: I1203 20:14:40.922098 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" 
(UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.922344 master-0 kubenswrapper[29252]: I1203 20:14:40.922125 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-web-config\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.922344 master-0 kubenswrapper[29252]: I1203 20:14:40.922156 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.922344 master-0 kubenswrapper[29252]: I1203 20:14:40.922175 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.922344 master-0 kubenswrapper[29252]: I1203 20:14:40.922199 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.922344 master-0 kubenswrapper[29252]: I1203 20:14:40.922228 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.922344 master-0 kubenswrapper[29252]: I1203 20:14:40.922254 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.922344 master-0 kubenswrapper[29252]: I1203 20:14:40.922278 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.922344 master-0 kubenswrapper[29252]: I1203 20:14:40.922298 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.922344 master-0 kubenswrapper[29252]: I1203 20:14:40.922338 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3560529-2f6a-4193-b606-18474b120488-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.922842 master-0 kubenswrapper[29252]: I1203 20:14:40.922375 29252 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.922842 master-0 kubenswrapper[29252]: I1203 20:14:40.922399 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.922842 master-0 kubenswrapper[29252]: I1203 20:14:40.922441 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-config\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.922842 master-0 kubenswrapper[29252]: I1203 20:14:40.922469 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3560529-2f6a-4193-b606-18474b120488-config-out\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.922842 master-0 kubenswrapper[29252]: I1203 20:14:40.922493 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.922842 master-0 kubenswrapper[29252]: I1203 20:14:40.922530 29252 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.925167 master-0 kubenswrapper[29252]: I1203 20:14:40.924280 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.925167 master-0 kubenswrapper[29252]: I1203 20:14:40.924709 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f3560529-2f6a-4193-b606-18474b120488-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.926364 master-0 kubenswrapper[29252]: I1203 20:14:40.925849 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.926933 master-0 kubenswrapper[29252]: I1203 20:14:40.926695 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.926933 
master-0 kubenswrapper[29252]: I1203 20:14:40.926891 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.927522 master-0 kubenswrapper[29252]: I1203 20:14:40.927478 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.940176 master-0 kubenswrapper[29252]: I1203 20:14:40.939925 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.942913 master-0 kubenswrapper[29252]: I1203 20:14:40.942835 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.942913 master-0 kubenswrapper[29252]: I1203 20:14:40.942901 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " 
pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.943090 master-0 kubenswrapper[29252]: I1203 20:14:40.942933 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.943152 master-0 kubenswrapper[29252]: I1203 20:14:40.943106 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.943387 master-0 kubenswrapper[29252]: I1203 20:14:40.943337 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-config\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.943682 master-0 kubenswrapper[29252]: I1203 20:14:40.943623 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3560529-2f6a-4193-b606-18474b120488-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.945825 master-0 kubenswrapper[29252]: I1203 20:14:40.945318 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 
20:14:40.945825 master-0 kubenswrapper[29252]: I1203 20:14:40.945788 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-web-config\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.946171 master-0 kubenswrapper[29252]: I1203 20:14:40.946133 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3560529-2f6a-4193-b606-18474b120488-config-out\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.946692 master-0 kubenswrapper[29252]: I1203 20:14:40.946617 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79xnq\" (UniqueName: \"kubernetes.io/projected/f3560529-2f6a-4193-b606-18474b120488-kube-api-access-79xnq\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:40.949652 master-0 kubenswrapper[29252]: I1203 20:14:40.949595 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:41.088382 master-0 kubenswrapper[29252]: I1203 20:14:41.087257 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:14:41.304614 master-0 kubenswrapper[29252]: W1203 20:14:41.304549 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6426848a_3e1d_4988_9749_5e7fc2620e51.slice/crio-3d2b120dbe3c83a9d3ee67340b719b29c2f99528319c49674650d81821263c36 WatchSource:0}: Error finding container 3d2b120dbe3c83a9d3ee67340b719b29c2f99528319c49674650d81821263c36: Status 404 returned error can't find the container with id 3d2b120dbe3c83a9d3ee67340b719b29c2f99528319c49674650d81821263c36 Dec 03 20:14:41.428143 master-0 kubenswrapper[29252]: I1203 20:14:41.426224 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c22cb59-5083-4be6-9998-a9e67a2c20cd" path="/var/lib/kubelet/pods/1c22cb59-5083-4be6-9998-a9e67a2c20cd/volumes" Dec 03 20:14:41.428143 master-0 kubenswrapper[29252]: I1203 20:14:41.426876 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c52974d8-fbe6-444b-97ae-468482eebac8" path="/var/lib/kubelet/pods/c52974d8-fbe6-444b-97ae-468482eebac8/volumes" Dec 03 20:14:41.506300 master-0 kubenswrapper[29252]: I1203 20:14:41.492001 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b5d787587-g9t7c" event={"ID":"6426848a-3e1d-4988-9749-5e7fc2620e51","Type":"ContainerStarted","Data":"3d2b120dbe3c83a9d3ee67340b719b29c2f99528319c49674650d81821263c36"} Dec 03 20:14:41.875833 master-0 kubenswrapper[29252]: W1203 20:14:41.875766 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00d60e6a_6ad3_4109_bb1e_30e656b91dc9.slice/crio-5388b2db9c0caeedfb40292ffae6ebd67146dbd7adde470c7094d42084b94c3c WatchSource:0}: Error finding container 5388b2db9c0caeedfb40292ffae6ebd67146dbd7adde470c7094d42084b94c3c: Status 404 returned error can't find the container with id 
5388b2db9c0caeedfb40292ffae6ebd67146dbd7adde470c7094d42084b94c3c Dec 03 20:14:42.257569 master-0 kubenswrapper[29252]: I1203 20:14:42.257327 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:42.257569 master-0 kubenswrapper[29252]: I1203 20:14:42.257390 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:14:42.261169 master-0 kubenswrapper[29252]: I1203 20:14:42.261129 29252 patch_prober.go:28] interesting pod/console-6fbbdc9bc8-g94br container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" start-of-body= Dec 03 20:14:42.261292 master-0 kubenswrapper[29252]: I1203 20:14:42.261197 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6fbbdc9bc8-g94br" podUID="d3f987dc-c7cb-4818-a321-6b92375224a0" containerName="console" probeResult="failure" output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" Dec 03 20:14:42.509518 master-0 kubenswrapper[29252]: I1203 20:14:42.509402 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" event={"ID":"00d60e6a-6ad3-4109-bb1e-30e656b91dc9","Type":"ContainerStarted","Data":"5388b2db9c0caeedfb40292ffae6ebd67146dbd7adde470c7094d42084b94c3c"} Dec 03 20:14:44.189881 master-0 kubenswrapper[29252]: I1203 20:14:44.189717 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-78df8f7475-2lnwf"] Dec 03 20:14:44.299841 master-0 kubenswrapper[29252]: I1203 20:14:44.297437 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 03 20:14:44.314007 master-0 kubenswrapper[29252]: W1203 20:14:44.313956 29252 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3560529_2f6a_4193_b606_18474b120488.slice/crio-d51105b3dc08f7680d5b29f7a5ad44e8c57e73d108687231c5eb70618423681a WatchSource:0}: Error finding container d51105b3dc08f7680d5b29f7a5ad44e8c57e73d108687231c5eb70618423681a: Status 404 returned error can't find the container with id d51105b3dc08f7680d5b29f7a5ad44e8c57e73d108687231c5eb70618423681a Dec 03 20:14:44.323765 master-0 kubenswrapper[29252]: I1203 20:14:44.323718 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6b8786b56c-g7dqt"] Dec 03 20:14:44.527879 master-0 kubenswrapper[29252]: I1203 20:14:44.527828 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" event={"ID":"9c6bc36a-ed58-4b4d-b602-14ff2d86e266","Type":"ContainerStarted","Data":"b0de573db8d08d3777b6f0ea697a92b1f1f0a4a7a70fdee62818e0875c96832c"} Dec 03 20:14:44.532471 master-0 kubenswrapper[29252]: I1203 20:14:44.531529 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" event={"ID":"70f550ce-35e6-482b-a7ff-4a8c11569406","Type":"ContainerStarted","Data":"1c09a245d90998d192f6f4413f806c0fb71706a37761e4737282fe9fe0aad9fb"} Dec 03 20:14:44.541111 master-0 kubenswrapper[29252]: I1203 20:14:44.541050 29252 generic.go:334] "Generic (PLEG): container finished" podID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerID="2db45bdca5ac382650099e41ec380f87182a50ddf6fab9295a34b23e8201c999" exitCode=0 Dec 03 20:14:44.541552 master-0 kubenswrapper[29252]: I1203 20:14:44.541528 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d","Type":"ContainerDied","Data":"2db45bdca5ac382650099e41ec380f87182a50ddf6fab9295a34b23e8201c999"} Dec 03 20:14:44.545731 master-0 kubenswrapper[29252]: I1203 
20:14:44.545675 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b5d787587-g9t7c" event={"ID":"6426848a-3e1d-4988-9749-5e7fc2620e51","Type":"ContainerStarted","Data":"3a9b6e3578080fa1e5c782639b46b837e227649335bdd9698dc3fcff6bb5a882"} Dec 03 20:14:44.547032 master-0 kubenswrapper[29252]: I1203 20:14:44.546986 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-78df8f7475-2lnwf" event={"ID":"5a1955e8-1b5a-40b5-b251-d22d715f0e0b","Type":"ContainerStarted","Data":"017f76dcc7f1dc602484aab0f96dc1b298a4d8613cec8aeaf20c5fb0da1ede23"} Dec 03 20:14:44.549644 master-0 kubenswrapper[29252]: I1203 20:14:44.549608 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x2wbs" event={"ID":"89f08828-d22f-48a0-b247-fbe323742568","Type":"ContainerStarted","Data":"dfb161133122b809471688ff13d0e449041c61d4cdc19dcbc56588bda097bf6c"} Dec 03 20:14:44.549766 master-0 kubenswrapper[29252]: I1203 20:14:44.549748 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x2wbs" event={"ID":"89f08828-d22f-48a0-b247-fbe323742568","Type":"ContainerStarted","Data":"0180e46cde4e0d90a629b936c70b9db2f3981c9bb0e068ed8ca7403559d646fd"} Dec 03 20:14:44.564170 master-0 kubenswrapper[29252]: I1203 20:14:44.564061 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" event={"ID":"28e4a5ec-9304-475d-8321-13b21985d688","Type":"ContainerStarted","Data":"89249d8fe22ae961e8ee56fe25eafa8bc17df1a13b57bbfa4f3066bcac7103c4"} Dec 03 20:14:44.564406 master-0 kubenswrapper[29252]: I1203 20:14:44.564388 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" event={"ID":"28e4a5ec-9304-475d-8321-13b21985d688","Type":"ContainerStarted","Data":"ed146d0277f1260ae22ebdda7168f8c680d5c278d7cb32bb8cafd6d33ef7a709"} Dec 03 20:14:44.570370 
master-0 kubenswrapper[29252]: I1203 20:14:44.570323 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3560529-2f6a-4193-b606-18474b120488","Type":"ContainerStarted","Data":"d51105b3dc08f7680d5b29f7a5ad44e8c57e73d108687231c5eb70618423681a"} Dec 03 20:14:44.587300 master-0 kubenswrapper[29252]: I1203 20:14:44.587170 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s" event={"ID":"922b1203-f140-45bb-94a2-6efb31cf5ee8","Type":"ContainerStarted","Data":"620d27b866291b303e8fe8f877aae3bb79927ef1a4b099be0091bc900cf176b1"} Dec 03 20:14:44.587300 master-0 kubenswrapper[29252]: I1203 20:14:44.587273 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s" event={"ID":"922b1203-f140-45bb-94a2-6efb31cf5ee8","Type":"ContainerStarted","Data":"76a6ec62195ac4ee7dd55c2419b97e285fa8c659d0d619d39163f4c5ccd5c40c"} Dec 03 20:14:44.594302 master-0 kubenswrapper[29252]: I1203 20:14:44.594186 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-4j26k" podStartSLOduration=8.162162449 podStartE2EDuration="12.594142686s" podCreationTimestamp="2025-12-03 20:14:32 +0000 UTC" firstStartedPulling="2025-12-03 20:14:39.231457727 +0000 UTC m=+314.045002690" lastFinishedPulling="2025-12-03 20:14:43.663437964 +0000 UTC m=+318.476982927" observedRunningTime="2025-12-03 20:14:44.565564053 +0000 UTC m=+319.379109016" watchObservedRunningTime="2025-12-03 20:14:44.594142686 +0000 UTC m=+319.407687639" Dec 03 20:14:44.595480 master-0 kubenswrapper[29252]: I1203 20:14:44.595440 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b5d787587-g9t7c" podStartSLOduration=5.595434448 podStartE2EDuration="5.595434448s" podCreationTimestamp="2025-12-03 20:14:39 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:14:44.590442717 +0000 UTC m=+319.403987680" watchObservedRunningTime="2025-12-03 20:14:44.595434448 +0000 UTC m=+319.408979401" Dec 03 20:14:44.628630 master-0 kubenswrapper[29252]: I1203 20:14:44.628538 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-x2wbs" podStartSLOduration=8.686038976 podStartE2EDuration="12.62850884s" podCreationTimestamp="2025-12-03 20:14:32 +0000 UTC" firstStartedPulling="2025-12-03 20:14:35.406614155 +0000 UTC m=+310.220159108" lastFinishedPulling="2025-12-03 20:14:39.349084019 +0000 UTC m=+314.162628972" observedRunningTime="2025-12-03 20:14:44.611622101 +0000 UTC m=+319.425167074" watchObservedRunningTime="2025-12-03 20:14:44.62850884 +0000 UTC m=+319.442053803" Dec 03 20:14:45.090994 master-0 kubenswrapper[29252]: I1203 20:14:45.090917 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" podStartSLOduration=8.774889639 podStartE2EDuration="13.090898814s" podCreationTimestamp="2025-12-03 20:14:32 +0000 UTC" firstStartedPulling="2025-12-03 20:14:39.345107213 +0000 UTC m=+314.158652176" lastFinishedPulling="2025-12-03 20:14:43.661116398 +0000 UTC m=+318.474661351" observedRunningTime="2025-12-03 20:14:45.084638491 +0000 UTC m=+319.898183464" watchObservedRunningTime="2025-12-03 20:14:45.090898814 +0000 UTC m=+319.904443767" Dec 03 20:14:45.602596 master-0 kubenswrapper[29252]: I1203 20:14:45.602113 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-wj8nr" event={"ID":"28e4a5ec-9304-475d-8321-13b21985d688","Type":"ContainerStarted","Data":"de676633e1bdeed49e82c9a4b2e1968ad63612968c12da62c147930bb083f010"} Dec 03 20:14:45.605513 master-0 kubenswrapper[29252]: I1203 20:14:45.605373 29252 
generic.go:334] "Generic (PLEG): container finished" podID="f3560529-2f6a-4193-b606-18474b120488" containerID="78c0709728fd6de20a4742e9ccb3c8eafe7bbbc9e3f8cb53aeaf2205b70b7762" exitCode=0 Dec 03 20:14:45.605513 master-0 kubenswrapper[29252]: I1203 20:14:45.605461 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3560529-2f6a-4193-b606-18474b120488","Type":"ContainerDied","Data":"78c0709728fd6de20a4742e9ccb3c8eafe7bbbc9e3f8cb53aeaf2205b70b7762"} Dec 03 20:14:45.609178 master-0 kubenswrapper[29252]: I1203 20:14:45.609131 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s" event={"ID":"922b1203-f140-45bb-94a2-6efb31cf5ee8","Type":"ContainerStarted","Data":"db0e5453fdc3ce6bdc8653d8096d970d0844b61b8c892cc10477636883e6e22a"} Dec 03 20:14:46.845896 master-0 kubenswrapper[29252]: I1203 20:14:46.845819 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-7c696657b7-nvqxt"] Dec 03 20:14:46.848375 master-0 kubenswrapper[29252]: I1203 20:14:46.848345 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c696657b7-nvqxt" Dec 03 20:14:46.854678 master-0 kubenswrapper[29252]: I1203 20:14:46.853250 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-7c696657b7-nvqxt"] Dec 03 20:14:46.854678 master-0 kubenswrapper[29252]: I1203 20:14:46.853395 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 20:14:46.854678 master-0 kubenswrapper[29252]: I1203 20:14:46.853705 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 20:14:46.976255 master-0 kubenswrapper[29252]: I1203 20:14:46.976146 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0590f508-0371-4369-a9f7-ee4ef5acbcac-networking-console-plugin-cert\") pod \"networking-console-plugin-7c696657b7-nvqxt\" (UID: \"0590f508-0371-4369-a9f7-ee4ef5acbcac\") " pod="openshift-network-console/networking-console-plugin-7c696657b7-nvqxt" Dec 03 20:14:46.976563 master-0 kubenswrapper[29252]: I1203 20:14:46.976270 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0590f508-0371-4369-a9f7-ee4ef5acbcac-nginx-conf\") pod \"networking-console-plugin-7c696657b7-nvqxt\" (UID: \"0590f508-0371-4369-a9f7-ee4ef5acbcac\") " pod="openshift-network-console/networking-console-plugin-7c696657b7-nvqxt" Dec 03 20:14:47.078127 master-0 kubenswrapper[29252]: I1203 20:14:47.077923 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0590f508-0371-4369-a9f7-ee4ef5acbcac-networking-console-plugin-cert\") pod 
\"networking-console-plugin-7c696657b7-nvqxt\" (UID: \"0590f508-0371-4369-a9f7-ee4ef5acbcac\") " pod="openshift-network-console/networking-console-plugin-7c696657b7-nvqxt" Dec 03 20:14:47.078127 master-0 kubenswrapper[29252]: I1203 20:14:47.077998 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0590f508-0371-4369-a9f7-ee4ef5acbcac-nginx-conf\") pod \"networking-console-plugin-7c696657b7-nvqxt\" (UID: \"0590f508-0371-4369-a9f7-ee4ef5acbcac\") " pod="openshift-network-console/networking-console-plugin-7c696657b7-nvqxt" Dec 03 20:14:47.078127 master-0 kubenswrapper[29252]: E1203 20:14:47.078088 29252 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Dec 03 20:14:47.078501 master-0 kubenswrapper[29252]: E1203 20:14:47.078169 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0590f508-0371-4369-a9f7-ee4ef5acbcac-networking-console-plugin-cert podName:0590f508-0371-4369-a9f7-ee4ef5acbcac nodeName:}" failed. No retries permitted until 2025-12-03 20:14:47.578143709 +0000 UTC m=+322.391688652 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0590f508-0371-4369-a9f7-ee4ef5acbcac-networking-console-plugin-cert") pod "networking-console-plugin-7c696657b7-nvqxt" (UID: "0590f508-0371-4369-a9f7-ee4ef5acbcac") : secret "networking-console-plugin-cert" not found Dec 03 20:14:47.079051 master-0 kubenswrapper[29252]: I1203 20:14:47.079008 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0590f508-0371-4369-a9f7-ee4ef5acbcac-nginx-conf\") pod \"networking-console-plugin-7c696657b7-nvqxt\" (UID: \"0590f508-0371-4369-a9f7-ee4ef5acbcac\") " pod="openshift-network-console/networking-console-plugin-7c696657b7-nvqxt" Dec 03 20:14:47.304253 master-0 kubenswrapper[29252]: I1203 20:14:47.304193 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fbbdc9bc8-g94br"] Dec 03 20:14:47.329354 master-0 kubenswrapper[29252]: I1203 20:14:47.329302 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-65c74dc56f-mlqjw"] Dec 03 20:14:47.330455 master-0 kubenswrapper[29252]: I1203 20:14:47.330395 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.345169 master-0 kubenswrapper[29252]: I1203 20:14:47.345113 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65c74dc56f-mlqjw"] Dec 03 20:14:47.485582 master-0 kubenswrapper[29252]: I1203 20:14:47.485472 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-trusted-ca-bundle\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.485954 master-0 kubenswrapper[29252]: I1203 20:14:47.485843 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e13ed7cc-6322-4676-88fc-363cff00f509-console-oauth-config\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.485954 master-0 kubenswrapper[29252]: I1203 20:14:47.485908 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-console-config\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.486191 master-0 kubenswrapper[29252]: I1203 20:14:47.486144 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-service-ca\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.486373 master-0 
kubenswrapper[29252]: I1203 20:14:47.486307 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hs6x\" (UniqueName: \"kubernetes.io/projected/e13ed7cc-6322-4676-88fc-363cff00f509-kube-api-access-5hs6x\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.486503 master-0 kubenswrapper[29252]: I1203 20:14:47.486452 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-oauth-serving-cert\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.486503 master-0 kubenswrapper[29252]: I1203 20:14:47.486478 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e13ed7cc-6322-4676-88fc-363cff00f509-console-serving-cert\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.587607 master-0 kubenswrapper[29252]: I1203 20:14:47.587487 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-trusted-ca-bundle\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.587607 master-0 kubenswrapper[29252]: I1203 20:14:47.587547 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e13ed7cc-6322-4676-88fc-363cff00f509-console-oauth-config\") pod \"console-65c74dc56f-mlqjw\" (UID: 
\"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.587607 master-0 kubenswrapper[29252]: I1203 20:14:47.587568 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-console-config\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.587880 master-0 kubenswrapper[29252]: I1203 20:14:47.587620 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-service-ca\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.587880 master-0 kubenswrapper[29252]: I1203 20:14:47.587678 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0590f508-0371-4369-a9f7-ee4ef5acbcac-networking-console-plugin-cert\") pod \"networking-console-plugin-7c696657b7-nvqxt\" (UID: \"0590f508-0371-4369-a9f7-ee4ef5acbcac\") " pod="openshift-network-console/networking-console-plugin-7c696657b7-nvqxt" Dec 03 20:14:47.587880 master-0 kubenswrapper[29252]: I1203 20:14:47.587697 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hs6x\" (UniqueName: \"kubernetes.io/projected/e13ed7cc-6322-4676-88fc-363cff00f509-kube-api-access-5hs6x\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.587880 master-0 kubenswrapper[29252]: I1203 20:14:47.587722 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-oauth-serving-cert\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.587880 master-0 kubenswrapper[29252]: I1203 20:14:47.587743 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e13ed7cc-6322-4676-88fc-363cff00f509-console-serving-cert\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.588887 master-0 kubenswrapper[29252]: I1203 20:14:47.588842 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-service-ca\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.588966 master-0 kubenswrapper[29252]: I1203 20:14:47.588894 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-oauth-serving-cert\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.588966 master-0 kubenswrapper[29252]: I1203 20:14:47.588900 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-console-config\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.590401 master-0 kubenswrapper[29252]: I1203 20:14:47.590375 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-trusted-ca-bundle\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.600600 master-0 kubenswrapper[29252]: I1203 20:14:47.600563 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e13ed7cc-6322-4676-88fc-363cff00f509-console-serving-cert\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.600848 master-0 kubenswrapper[29252]: I1203 20:14:47.600811 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e13ed7cc-6322-4676-88fc-363cff00f509-console-oauth-config\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.602717 master-0 kubenswrapper[29252]: I1203 20:14:47.602671 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hs6x\" (UniqueName: \"kubernetes.io/projected/e13ed7cc-6322-4676-88fc-363cff00f509-kube-api-access-5hs6x\") pod \"console-65c74dc56f-mlqjw\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.603307 master-0 kubenswrapper[29252]: I1203 20:14:47.603264 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0590f508-0371-4369-a9f7-ee4ef5acbcac-networking-console-plugin-cert\") pod \"networking-console-plugin-7c696657b7-nvqxt\" (UID: \"0590f508-0371-4369-a9f7-ee4ef5acbcac\") " pod="openshift-network-console/networking-console-plugin-7c696657b7-nvqxt" Dec 03 20:14:47.655759 master-0 kubenswrapper[29252]: I1203 20:14:47.655684 29252 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:47.820854 master-0 kubenswrapper[29252]: I1203 20:14:47.820798 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c696657b7-nvqxt" Dec 03 20:14:49.959638 master-0 kubenswrapper[29252]: I1203 20:14:49.959549 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:49.961290 master-0 kubenswrapper[29252]: I1203 20:14:49.961229 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5b5d787587-g9t7c" Dec 03 20:14:49.962602 master-0 kubenswrapper[29252]: I1203 20:14:49.962559 29252 patch_prober.go:28] interesting pod/console-5b5d787587-g9t7c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.100:8443/health\": dial tcp 10.128.0.100:8443: connect: connection refused" start-of-body= Dec 03 20:14:49.962660 master-0 kubenswrapper[29252]: I1203 20:14:49.962618 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-5b5d787587-g9t7c" podUID="6426848a-3e1d-4988-9749-5e7fc2620e51" containerName="console" probeResult="failure" output="Get \"https://10.128.0.100:8443/health\": dial tcp 10.128.0.100:8443: connect: connection refused" Dec 03 20:14:50.573735 master-0 kubenswrapper[29252]: I1203 20:14:50.573672 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-7c696657b7-nvqxt"] Dec 03 20:14:50.654179 master-0 kubenswrapper[29252]: I1203 20:14:50.652099 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" 
event={"ID":"9c6bc36a-ed58-4b4d-b602-14ff2d86e266","Type":"ContainerStarted","Data":"8cedf9240f21aa4a76b4bcd4807bbd9e0076bc3d73a2fffcf882406270a545b7"} Dec 03 20:14:50.654179 master-0 kubenswrapper[29252]: I1203 20:14:50.653460 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-7c696657b7-nvqxt" event={"ID":"0590f508-0371-4369-a9f7-ee4ef5acbcac","Type":"ContainerStarted","Data":"01d9e6ce8f7ce702483276633761faf00ce14d30ceb4b3a0d555ed46cefed25d"} Dec 03 20:14:50.656252 master-0 kubenswrapper[29252]: I1203 20:14:50.656162 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-78df8f7475-2lnwf" event={"ID":"5a1955e8-1b5a-40b5-b251-d22d715f0e0b","Type":"ContainerStarted","Data":"34e9c93db30a7eecfff53d4c64389097bd7c3376ac2b19f24b61741b7991845f"} Dec 03 20:14:50.658056 master-0 kubenswrapper[29252]: I1203 20:14:50.658033 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-78df8f7475-2lnwf" Dec 03 20:14:50.659127 master-0 kubenswrapper[29252]: I1203 20:14:50.659103 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" event={"ID":"00d60e6a-6ad3-4109-bb1e-30e656b91dc9","Type":"ContainerStarted","Data":"c584e8e13cd769a898a524b8485eae95d814f42a090222c56602450fdb202eb5"} Dec 03 20:14:50.663198 master-0 kubenswrapper[29252]: I1203 20:14:50.663158 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-78df8f7475-2lnwf" Dec 03 20:14:50.672381 master-0 kubenswrapper[29252]: I1203 20:14:50.672332 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d","Type":"ContainerStarted","Data":"0976a89f464ebb972c93a46088e9eeb54bd3bcf4771fafb2ab2a84f679391cfb"} Dec 03 20:14:50.674154 master-0 
kubenswrapper[29252]: I1203 20:14:50.674115 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3560529-2f6a-4193-b606-18474b120488","Type":"ContainerStarted","Data":"1d7cd4ec51aa6a09535a7654afad8e7721a38290da904e6611ba943292b9d2ad"} Dec 03 20:14:50.685859 master-0 kubenswrapper[29252]: I1203 20:14:50.685000 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s" event={"ID":"922b1203-f140-45bb-94a2-6efb31cf5ee8","Type":"ContainerStarted","Data":"eb88b2f6c316771a42528245cece63ef8c0f1f68a804e16767a1e0a213ceca20"} Dec 03 20:14:50.687481 master-0 kubenswrapper[29252]: I1203 20:14:50.687400 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" podStartSLOduration=4.904598855 podStartE2EDuration="10.687379161s" podCreationTimestamp="2025-12-03 20:14:40 +0000 UTC" firstStartedPulling="2025-12-03 20:14:44.340280479 +0000 UTC m=+319.153825432" lastFinishedPulling="2025-12-03 20:14:50.123060785 +0000 UTC m=+324.936605738" observedRunningTime="2025-12-03 20:14:50.682916893 +0000 UTC m=+325.496461846" watchObservedRunningTime="2025-12-03 20:14:50.687379161 +0000 UTC m=+325.500924114" Dec 03 20:14:50.714830 master-0 kubenswrapper[29252]: I1203 20:14:50.714408 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-78df8f7475-2lnwf" podStartSLOduration=4.8721597580000005 podStartE2EDuration="10.714389036s" podCreationTimestamp="2025-12-03 20:14:40 +0000 UTC" firstStartedPulling="2025-12-03 20:14:44.253542146 +0000 UTC m=+319.067087099" lastFinishedPulling="2025-12-03 20:14:50.095771424 +0000 UTC m=+324.909316377" observedRunningTime="2025-12-03 20:14:50.712732236 +0000 UTC m=+325.526277199" watchObservedRunningTime="2025-12-03 20:14:50.714389036 +0000 UTC m=+325.527933989" Dec 03 20:14:50.750491 master-0 
kubenswrapper[29252]: W1203 20:14:50.750446 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode13ed7cc_6322_4676_88fc_363cff00f509.slice/crio-b64169813e76f27d5b81af1f242ceb058bc9e3bf5600cb7456188abdcedde17d WatchSource:0}: Error finding container b64169813e76f27d5b81af1f242ceb058bc9e3bf5600cb7456188abdcedde17d: Status 404 returned error can't find the container with id b64169813e76f27d5b81af1f242ceb058bc9e3bf5600cb7456188abdcedde17d Dec 03 20:14:50.753330 master-0 kubenswrapper[29252]: I1203 20:14:50.753291 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65c74dc56f-mlqjw"] Dec 03 20:14:51.032805 master-0 kubenswrapper[29252]: I1203 20:14:51.022459 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Dec 03 20:14:51.032805 master-0 kubenswrapper[29252]: I1203 20:14:51.023251 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 20:14:51.032805 master-0 kubenswrapper[29252]: I1203 20:14:51.026223 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-n2brl" Dec 03 20:14:51.037409 master-0 kubenswrapper[29252]: I1203 20:14:51.035726 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 20:14:51.071824 master-0 kubenswrapper[29252]: I1203 20:14:51.063624 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Dec 03 20:14:51.160673 master-0 kubenswrapper[29252]: I1203 20:14:51.160619 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/108176a9-101d-4204-8ed3-4ed41ccdaae0-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"108176a9-101d-4204-8ed3-4ed41ccdaae0\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 20:14:51.160764 master-0 kubenswrapper[29252]: I1203 20:14:51.160685 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/108176a9-101d-4204-8ed3-4ed41ccdaae0-var-lock\") pod \"installer-4-master-0\" (UID: \"108176a9-101d-4204-8ed3-4ed41ccdaae0\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 20:14:51.160764 master-0 kubenswrapper[29252]: I1203 20:14:51.160720 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/108176a9-101d-4204-8ed3-4ed41ccdaae0-kube-api-access\") pod \"installer-4-master-0\" (UID: \"108176a9-101d-4204-8ed3-4ed41ccdaae0\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 20:14:51.264840 master-0 kubenswrapper[29252]: I1203 20:14:51.264788 29252 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/108176a9-101d-4204-8ed3-4ed41ccdaae0-kube-api-access\") pod \"installer-4-master-0\" (UID: \"108176a9-101d-4204-8ed3-4ed41ccdaae0\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 20:14:51.264975 master-0 kubenswrapper[29252]: I1203 20:14:51.264933 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/108176a9-101d-4204-8ed3-4ed41ccdaae0-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"108176a9-101d-4204-8ed3-4ed41ccdaae0\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 20:14:51.264975 master-0 kubenswrapper[29252]: I1203 20:14:51.264963 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/108176a9-101d-4204-8ed3-4ed41ccdaae0-var-lock\") pod \"installer-4-master-0\" (UID: \"108176a9-101d-4204-8ed3-4ed41ccdaae0\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 20:14:51.265097 master-0 kubenswrapper[29252]: I1203 20:14:51.265058 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/108176a9-101d-4204-8ed3-4ed41ccdaae0-var-lock\") pod \"installer-4-master-0\" (UID: \"108176a9-101d-4204-8ed3-4ed41ccdaae0\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 20:14:51.265149 master-0 kubenswrapper[29252]: I1203 20:14:51.265103 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/108176a9-101d-4204-8ed3-4ed41ccdaae0-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"108176a9-101d-4204-8ed3-4ed41ccdaae0\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 20:14:51.285182 master-0 kubenswrapper[29252]: I1203 20:14:51.285048 29252 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/108176a9-101d-4204-8ed3-4ed41ccdaae0-kube-api-access\") pod \"installer-4-master-0\" (UID: \"108176a9-101d-4204-8ed3-4ed41ccdaae0\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 20:14:51.359227 master-0 kubenswrapper[29252]: I1203 20:14:51.359162 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 20:14:51.698948 master-0 kubenswrapper[29252]: I1203 20:14:51.698885 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65c74dc56f-mlqjw" event={"ID":"e13ed7cc-6322-4676-88fc-363cff00f509","Type":"ContainerStarted","Data":"f54a06368dd236747f03ccfb28200dab1a76dccafc53a58a78b24d448650bca8"} Dec 03 20:14:51.698948 master-0 kubenswrapper[29252]: I1203 20:14:51.698937 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65c74dc56f-mlqjw" event={"ID":"e13ed7cc-6322-4676-88fc-363cff00f509","Type":"ContainerStarted","Data":"b64169813e76f27d5b81af1f242ceb058bc9e3bf5600cb7456188abdcedde17d"} Dec 03 20:14:51.706841 master-0 kubenswrapper[29252]: I1203 20:14:51.706785 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3560529-2f6a-4193-b606-18474b120488","Type":"ContainerStarted","Data":"f181c4fccef6b5098a5cf8c94ad1bdd8de61d3ad6b762757fe9a7bdfb22c63a5"} Dec 03 20:14:51.709606 master-0 kubenswrapper[29252]: I1203 20:14:51.708854 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3560529-2f6a-4193-b606-18474b120488","Type":"ContainerStarted","Data":"e1de45814aaa6ff018c658de9e1964f6f62f6d7e90feb94cdb56eb7462f11358"} Dec 03 20:14:51.709606 master-0 kubenswrapper[29252]: I1203 20:14:51.708906 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"f3560529-2f6a-4193-b606-18474b120488","Type":"ContainerStarted","Data":"3ad78927878b51cd7f4848ccccfad56ebf4205f807aa88db1abae4a630d88721"} Dec 03 20:14:51.724660 master-0 kubenswrapper[29252]: I1203 20:14:51.724605 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s" event={"ID":"922b1203-f140-45bb-94a2-6efb31cf5ee8","Type":"ContainerStarted","Data":"cc812f1341fe3812b2194e406d01e9f3966f2f71777d65c45218b6e90bf19156"} Dec 03 20:14:51.724934 master-0 kubenswrapper[29252]: I1203 20:14:51.724915 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s" event={"ID":"922b1203-f140-45bb-94a2-6efb31cf5ee8","Type":"ContainerStarted","Data":"c6dbd6fdb952afaf9f397a6bb4ff7a9c8b6cbe444360d8b3285583451a25af68"} Dec 03 20:14:51.725643 master-0 kubenswrapper[29252]: I1203 20:14:51.725621 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s" Dec 03 20:14:51.730279 master-0 kubenswrapper[29252]: I1203 20:14:51.730247 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" event={"ID":"00d60e6a-6ad3-4109-bb1e-30e656b91dc9","Type":"ContainerStarted","Data":"f85236327e89492bf8f806f86b542d914ccfc4a6e2a11e2c2524451f68dbc960"} Dec 03 20:14:51.730408 master-0 kubenswrapper[29252]: I1203 20:14:51.730392 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" event={"ID":"00d60e6a-6ad3-4109-bb1e-30e656b91dc9","Type":"ContainerStarted","Data":"cfc7ac6f459d729773247c82f93fa803ebe42b770a614bc8b450ab6b674fd570"} Dec 03 20:14:51.734242 master-0 kubenswrapper[29252]: I1203 20:14:51.732992 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s" Dec 03 20:14:51.735397 master-0 kubenswrapper[29252]: 
I1203 20:14:51.735335 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d","Type":"ContainerStarted","Data":"34ce07503848cd6aad62ba91f3c407cfb5a322733a238fe0815d01f74c614873"} Dec 03 20:14:51.735481 master-0 kubenswrapper[29252]: I1203 20:14:51.735399 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d","Type":"ContainerStarted","Data":"47a401e5c33c18bb1cfb970151b713d5420adf0b84eb4a88be63ba450bd5a61b"} Dec 03 20:14:51.735481 master-0 kubenswrapper[29252]: I1203 20:14:51.735421 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d","Type":"ContainerStarted","Data":"41edc0c3867479b941ec31180dac8bca736f22ef5242e5d1acff2ee882afe88a"} Dec 03 20:14:52.497702 master-0 kubenswrapper[29252]: I1203 20:14:52.497594 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-65c74dc56f-mlqjw" podStartSLOduration=5.497563072 podStartE2EDuration="5.497563072s" podCreationTimestamp="2025-12-03 20:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:14:52.494903518 +0000 UTC m=+327.308448511" watchObservedRunningTime="2025-12-03 20:14:52.497563072 +0000 UTC m=+327.311108035" Dec 03 20:14:52.762757 master-0 kubenswrapper[29252]: I1203 20:14:52.761202 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d","Type":"ContainerStarted","Data":"a47d8d5fc9fb8d8f5c161e8c2f4a0a8e14e1a13017007439234149dd1f6a68f4"} Dec 03 20:14:55.797821 master-0 kubenswrapper[29252]: I1203 20:14:55.793731 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3560529-2f6a-4193-b606-18474b120488","Type":"ContainerStarted","Data":"0bd8d740d658995d8dd28eae580836ac58f46ce3526c9a1e22c7f53333d01c60"} Dec 03 20:14:57.656974 master-0 kubenswrapper[29252]: I1203 20:14:57.656914 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:57.657992 master-0 kubenswrapper[29252]: I1203 20:14:57.657123 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:14:57.659189 master-0 kubenswrapper[29252]: I1203 20:14:57.659125 29252 patch_prober.go:28] interesting pod/console-65c74dc56f-mlqjw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Dec 03 20:14:57.659349 master-0 kubenswrapper[29252]: I1203 20:14:57.659195 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-65c74dc56f-mlqjw" podUID="e13ed7cc-6322-4676-88fc-363cff00f509" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Dec 03 20:14:58.661000 master-0 kubenswrapper[29252]: I1203 20:14:58.660845 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Dec 03 20:14:58.672949 master-0 kubenswrapper[29252]: I1203 20:14:58.672033 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-54c84f8475-9xl5s" podStartSLOduration=12.202989503 podStartE2EDuration="24.671989747s" podCreationTimestamp="2025-12-03 20:14:34 +0000 UTC" firstStartedPulling="2025-12-03 20:14:37.653468307 +0000 UTC m=+312.467013260" lastFinishedPulling="2025-12-03 20:14:50.122468551 +0000 UTC m=+324.936013504" 
observedRunningTime="2025-12-03 20:14:58.653492029 +0000 UTC m=+333.467036992" watchObservedRunningTime="2025-12-03 20:14:58.671989747 +0000 UTC m=+333.485534700" Dec 03 20:14:58.728726 master-0 kubenswrapper[29252]: I1203 20:14:58.726465 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5c6d5cb75d-gcw4h" podStartSLOduration=11.509907466 podStartE2EDuration="19.726442438s" podCreationTimestamp="2025-12-03 20:14:39 +0000 UTC" firstStartedPulling="2025-12-03 20:14:41.879224702 +0000 UTC m=+316.692769665" lastFinishedPulling="2025-12-03 20:14:50.095759684 +0000 UTC m=+324.909304637" observedRunningTime="2025-12-03 20:14:58.69971874 +0000 UTC m=+333.513263683" watchObservedRunningTime="2025-12-03 20:14:58.726442438 +0000 UTC m=+333.539987391" Dec 03 20:14:58.830394 master-0 kubenswrapper[29252]: I1203 20:14:58.830339 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d","Type":"ContainerStarted","Data":"1451a79631eaf16c9eb478f51661577aa37eaea15ed18eb83a425743c7c87e7e"} Dec 03 20:14:58.872955 master-0 kubenswrapper[29252]: I1203 20:14:58.869262 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=11.707484115 podStartE2EDuration="25.869244001s" podCreationTimestamp="2025-12-03 20:14:33 +0000 UTC" firstStartedPulling="2025-12-03 20:14:35.934931939 +0000 UTC m=+310.748476892" lastFinishedPulling="2025-12-03 20:14:50.096691825 +0000 UTC m=+324.910236778" observedRunningTime="2025-12-03 20:14:58.863518962 +0000 UTC m=+333.677063935" watchObservedRunningTime="2025-12-03 20:14:58.869244001 +0000 UTC m=+333.682788954" Dec 03 20:14:59.959882 master-0 kubenswrapper[29252]: I1203 20:14:59.959815 29252 patch_prober.go:28] interesting pod/console-5b5d787587-g9t7c container/console namespace/openshift-console: Startup probe 
status=failure output="Get \"https://10.128.0.100:8443/health\": dial tcp 10.128.0.100:8443: connect: connection refused" start-of-body= Dec 03 20:14:59.960415 master-0 kubenswrapper[29252]: I1203 20:14:59.959902 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-5b5d787587-g9t7c" podUID="6426848a-3e1d-4988-9749-5e7fc2620e51" containerName="console" probeResult="failure" output="Get \"https://10.128.0.100:8443/health\": dial tcp 10.128.0.100:8443: connect: connection refused" Dec 03 20:15:00.161770 master-0 kubenswrapper[29252]: I1203 20:15:00.161594 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2"] Dec 03 20:15:00.162812 master-0 kubenswrapper[29252]: I1203 20:15:00.162746 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2" Dec 03 20:15:00.169983 master-0 kubenswrapper[29252]: I1203 20:15:00.169929 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-nqzgz" Dec 03 20:15:00.172257 master-0 kubenswrapper[29252]: I1203 20:15:00.172116 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 20:15:00.178359 master-0 kubenswrapper[29252]: I1203 20:15:00.178308 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2"] Dec 03 20:15:00.299657 master-0 kubenswrapper[29252]: I1203 20:15:00.299521 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb848be3-861d-4b35-b3ee-1b19720c1e4c-secret-volume\") pod \"collect-profiles-29413215-cb4r2\" (UID: \"bb848be3-861d-4b35-b3ee-1b19720c1e4c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2" Dec 03 20:15:00.299657 master-0 kubenswrapper[29252]: I1203 20:15:00.299601 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb848be3-861d-4b35-b3ee-1b19720c1e4c-config-volume\") pod \"collect-profiles-29413215-cb4r2\" (UID: \"bb848be3-861d-4b35-b3ee-1b19720c1e4c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2" Dec 03 20:15:00.299657 master-0 kubenswrapper[29252]: I1203 20:15:00.299651 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f8wf\" (UniqueName: \"kubernetes.io/projected/bb848be3-861d-4b35-b3ee-1b19720c1e4c-kube-api-access-9f8wf\") pod \"collect-profiles-29413215-cb4r2\" (UID: \"bb848be3-861d-4b35-b3ee-1b19720c1e4c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2" Dec 03 20:15:00.411132 master-0 kubenswrapper[29252]: I1203 20:15:00.401552 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb848be3-861d-4b35-b3ee-1b19720c1e4c-secret-volume\") pod \"collect-profiles-29413215-cb4r2\" (UID: \"bb848be3-861d-4b35-b3ee-1b19720c1e4c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2" Dec 03 20:15:00.411132 master-0 kubenswrapper[29252]: I1203 20:15:00.401634 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb848be3-861d-4b35-b3ee-1b19720c1e4c-config-volume\") pod \"collect-profiles-29413215-cb4r2\" (UID: \"bb848be3-861d-4b35-b3ee-1b19720c1e4c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2" Dec 03 20:15:00.411132 master-0 kubenswrapper[29252]: I1203 20:15:00.401712 29252 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9f8wf\" (UniqueName: \"kubernetes.io/projected/bb848be3-861d-4b35-b3ee-1b19720c1e4c-kube-api-access-9f8wf\") pod \"collect-profiles-29413215-cb4r2\" (UID: \"bb848be3-861d-4b35-b3ee-1b19720c1e4c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2" Dec 03 20:15:00.411132 master-0 kubenswrapper[29252]: I1203 20:15:00.406070 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb848be3-861d-4b35-b3ee-1b19720c1e4c-secret-volume\") pod \"collect-profiles-29413215-cb4r2\" (UID: \"bb848be3-861d-4b35-b3ee-1b19720c1e4c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2" Dec 03 20:15:00.411132 master-0 kubenswrapper[29252]: I1203 20:15:00.407320 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb848be3-861d-4b35-b3ee-1b19720c1e4c-config-volume\") pod \"collect-profiles-29413215-cb4r2\" (UID: \"bb848be3-861d-4b35-b3ee-1b19720c1e4c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2" Dec 03 20:15:00.667057 master-0 kubenswrapper[29252]: I1203 20:15:00.666979 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:15:00.667057 master-0 kubenswrapper[29252]: I1203 20:15:00.667040 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt" Dec 03 20:15:01.001346 master-0 kubenswrapper[29252]: I1203 20:15:00.996105 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f8wf\" (UniqueName: \"kubernetes.io/projected/bb848be3-861d-4b35-b3ee-1b19720c1e4c-kube-api-access-9f8wf\") pod \"collect-profiles-29413215-cb4r2\" (UID: \"bb848be3-861d-4b35-b3ee-1b19720c1e4c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2" Dec 03 20:15:01.101446 master-0 kubenswrapper[29252]: I1203 20:15:01.101387 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2" Dec 03 20:15:01.317824 master-0 kubenswrapper[29252]: I1203 20:15:01.317669 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" podUID="8db6fff6-9e07-4c7d-97b7-dea394f706c6" containerName="oauth-openshift" containerID="cri-o://99029e5e551e0cf40043fe1556eb3f5a583127a54d7df23af10275f14e3ca238" gracePeriod=15 Dec 03 20:15:01.444203 master-0 kubenswrapper[29252]: I1203 20:15:01.440903 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b5d787587-g9t7c"] Dec 03 20:15:01.491592 master-0 kubenswrapper[29252]: I1203 20:15:01.491537 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6465b775c-7mmtn"] Dec 03 20:15:01.492680 master-0 kubenswrapper[29252]: I1203 20:15:01.492663 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.504721 master-0 kubenswrapper[29252]: I1203 20:15:01.504661 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6465b775c-7mmtn"] Dec 03 20:15:01.533796 master-0 kubenswrapper[29252]: I1203 20:15:01.526096 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-console-config\") pod \"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.533796 master-0 kubenswrapper[29252]: I1203 20:15:01.526186 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d2903de-a51a-415a-80be-9ba79b4e173d-console-oauth-config\") pod \"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.533796 master-0 kubenswrapper[29252]: I1203 20:15:01.526212 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lsj6\" (UniqueName: \"kubernetes.io/projected/3d2903de-a51a-415a-80be-9ba79b4e173d-kube-api-access-2lsj6\") pod \"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.533796 master-0 kubenswrapper[29252]: I1203 20:15:01.526869 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-service-ca\") pod \"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.533796 master-0 
kubenswrapper[29252]: I1203 20:15:01.526908 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-oauth-serving-cert\") pod \"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.533796 master-0 kubenswrapper[29252]: I1203 20:15:01.526963 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d2903de-a51a-415a-80be-9ba79b4e173d-console-serving-cert\") pod \"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.533796 master-0 kubenswrapper[29252]: I1203 20:15:01.527018 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-trusted-ca-bundle\") pod \"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.628026 master-0 kubenswrapper[29252]: I1203 20:15:01.627966 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d2903de-a51a-415a-80be-9ba79b4e173d-console-serving-cert\") pod \"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.628370 master-0 kubenswrapper[29252]: I1203 20:15:01.628036 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-trusted-ca-bundle\") pod \"console-6465b775c-7mmtn\" (UID: 
\"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.628370 master-0 kubenswrapper[29252]: I1203 20:15:01.628127 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-console-config\") pod \"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.628370 master-0 kubenswrapper[29252]: I1203 20:15:01.628167 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d2903de-a51a-415a-80be-9ba79b4e173d-console-oauth-config\") pod \"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.628370 master-0 kubenswrapper[29252]: I1203 20:15:01.628196 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lsj6\" (UniqueName: \"kubernetes.io/projected/3d2903de-a51a-415a-80be-9ba79b4e173d-kube-api-access-2lsj6\") pod \"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.628370 master-0 kubenswrapper[29252]: I1203 20:15:01.628254 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-service-ca\") pod \"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.628370 master-0 kubenswrapper[29252]: I1203 20:15:01.628278 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-oauth-serving-cert\") pod 
\"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.629256 master-0 kubenswrapper[29252]: I1203 20:15:01.629224 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-oauth-serving-cert\") pod \"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.629363 master-0 kubenswrapper[29252]: I1203 20:15:01.629322 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-console-config\") pod \"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.629613 master-0 kubenswrapper[29252]: I1203 20:15:01.629575 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-service-ca\") pod \"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.629664 master-0 kubenswrapper[29252]: I1203 20:15:01.629608 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-trusted-ca-bundle\") pod \"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.631091 master-0 kubenswrapper[29252]: I1203 20:15:01.631051 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d2903de-a51a-415a-80be-9ba79b4e173d-console-serving-cert\") pod 
\"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.632167 master-0 kubenswrapper[29252]: I1203 20:15:01.632136 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d2903de-a51a-415a-80be-9ba79b4e173d-console-oauth-config\") pod \"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.645512 master-0 kubenswrapper[29252]: I1203 20:15:01.645469 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lsj6\" (UniqueName: \"kubernetes.io/projected/3d2903de-a51a-415a-80be-9ba79b4e173d-kube-api-access-2lsj6\") pod \"console-6465b775c-7mmtn\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.822140 master-0 kubenswrapper[29252]: I1203 20:15:01.822076 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:15:01.858117 master-0 kubenswrapper[29252]: I1203 20:15:01.858069 29252 generic.go:334] "Generic (PLEG): container finished" podID="8db6fff6-9e07-4c7d-97b7-dea394f706c6" containerID="99029e5e551e0cf40043fe1556eb3f5a583127a54d7df23af10275f14e3ca238" exitCode=0 Dec 03 20:15:01.858117 master-0 kubenswrapper[29252]: I1203 20:15:01.858112 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" event={"ID":"8db6fff6-9e07-4c7d-97b7-dea394f706c6","Type":"ContainerDied","Data":"99029e5e551e0cf40043fe1556eb3f5a583127a54d7df23af10275f14e3ca238"} Dec 03 20:15:04.641487 master-0 kubenswrapper[29252]: I1203 20:15:04.641406 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-5b6f946576-zgpxr" podUID="33f0cc6e-2015-4c7e-848f-ccca37ad61c4" containerName="console" containerID="cri-o://568b90ae97cb9fa30fe2248b862ed4b9d85f7d7109a889f6a7199ab4cbf90805" gracePeriod=15 Dec 03 20:15:04.880498 master-0 kubenswrapper[29252]: I1203 20:15:04.880457 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b6f946576-zgpxr_33f0cc6e-2015-4c7e-848f-ccca37ad61c4/console/0.log" Dec 03 20:15:04.880719 master-0 kubenswrapper[29252]: I1203 20:15:04.880536 29252 generic.go:334] "Generic (PLEG): container finished" podID="33f0cc6e-2015-4c7e-848f-ccca37ad61c4" containerID="568b90ae97cb9fa30fe2248b862ed4b9d85f7d7109a889f6a7199ab4cbf90805" exitCode=2 Dec 03 20:15:04.880719 master-0 kubenswrapper[29252]: I1203 20:15:04.880569 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b6f946576-zgpxr" event={"ID":"33f0cc6e-2015-4c7e-848f-ccca37ad61c4","Type":"ContainerDied","Data":"568b90ae97cb9fa30fe2248b862ed4b9d85f7d7109a889f6a7199ab4cbf90805"} Dec 03 20:15:07.657067 master-0 kubenswrapper[29252]: I1203 20:15:07.656601 29252 patch_prober.go:28] 
interesting pod/console-65c74dc56f-mlqjw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Dec 03 20:15:07.657067 master-0 kubenswrapper[29252]: I1203 20:15:07.656666 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-65c74dc56f-mlqjw" podUID="e13ed7cc-6322-4676-88fc-363cff00f509" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Dec 03 20:15:09.431044 master-0 kubenswrapper[29252]: I1203 20:15:09.430418 29252 patch_prober.go:28] interesting pod/oauth-openshift-65d8f97447-xswx9 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.90:6443/healthz\": dial tcp 10.128.0.90:6443: connect: connection refused" start-of-body= Dec 03 20:15:09.431044 master-0 kubenswrapper[29252]: I1203 20:15:09.430492 29252 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" podUID="8db6fff6-9e07-4c7d-97b7-dea394f706c6" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.90:6443/healthz\": dial tcp 10.128.0.90:6443: connect: connection refused" Dec 03 20:15:12.342248 master-0 kubenswrapper[29252]: I1203 20:15:12.342152 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6fbbdc9bc8-g94br" podUID="d3f987dc-c7cb-4818-a321-6b92375224a0" containerName="console" containerID="cri-o://3f031dce6a1dc6f83e22c81233f26ae445eb5ba510b6d092c4d465872022408d" gracePeriod=15 Dec 03 20:15:12.939066 master-0 kubenswrapper[29252]: I1203 20:15:12.938946 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fbbdc9bc8-g94br_d3f987dc-c7cb-4818-a321-6b92375224a0/console/0.log" 
Dec 03 20:15:12.939066 master-0 kubenswrapper[29252]: I1203 20:15:12.939013 29252 generic.go:334] "Generic (PLEG): container finished" podID="d3f987dc-c7cb-4818-a321-6b92375224a0" containerID="3f031dce6a1dc6f83e22c81233f26ae445eb5ba510b6d092c4d465872022408d" exitCode=2 Dec 03 20:15:12.939066 master-0 kubenswrapper[29252]: I1203 20:15:12.939049 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fbbdc9bc8-g94br" event={"ID":"d3f987dc-c7cb-4818-a321-6b92375224a0","Type":"ContainerDied","Data":"3f031dce6a1dc6f83e22c81233f26ae445eb5ba510b6d092c4d465872022408d"} Dec 03 20:15:14.261965 master-0 kubenswrapper[29252]: W1203 20:15:14.261662 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod108176a9_101d_4204_8ed3_4ed41ccdaae0.slice/crio-16fdc6b40014b79b88a4feddf0d138a0e52af227df44c76bb17b06c77df24956 WatchSource:0}: Error finding container 16fdc6b40014b79b88a4feddf0d138a0e52af227df44c76bb17b06c77df24956: Status 404 returned error can't find the container with id 16fdc6b40014b79b88a4feddf0d138a0e52af227df44c76bb17b06c77df24956 Dec 03 20:15:14.356242 master-0 kubenswrapper[29252]: I1203 20:15:14.356105 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" Dec 03 20:15:14.366864 master-0 kubenswrapper[29252]: I1203 20:15:14.362928 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fbbdc9bc8-g94br_d3f987dc-c7cb-4818-a321-6b92375224a0/console/0.log" Dec 03 20:15:14.366864 master-0 kubenswrapper[29252]: I1203 20:15:14.362986 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:15:14.397511 master-0 kubenswrapper[29252]: I1203 20:15:14.397469 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-65494c7-95b4g"] Dec 03 20:15:14.397821 master-0 kubenswrapper[29252]: E1203 20:15:14.397805 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f987dc-c7cb-4818-a321-6b92375224a0" containerName="console" Dec 03 20:15:14.397894 master-0 kubenswrapper[29252]: I1203 20:15:14.397822 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f987dc-c7cb-4818-a321-6b92375224a0" containerName="console" Dec 03 20:15:14.397894 master-0 kubenswrapper[29252]: E1203 20:15:14.397836 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db6fff6-9e07-4c7d-97b7-dea394f706c6" containerName="oauth-openshift" Dec 03 20:15:14.397894 master-0 kubenswrapper[29252]: I1203 20:15:14.397842 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db6fff6-9e07-4c7d-97b7-dea394f706c6" containerName="oauth-openshift" Dec 03 20:15:14.398451 master-0 kubenswrapper[29252]: I1203 20:15:14.398006 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3f987dc-c7cb-4818-a321-6b92375224a0" containerName="console" Dec 03 20:15:14.398451 master-0 kubenswrapper[29252]: I1203 20:15:14.398048 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db6fff6-9e07-4c7d-97b7-dea394f706c6" containerName="oauth-openshift" Dec 03 20:15:14.398825 master-0 kubenswrapper[29252]: I1203 20:15:14.398504 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.418606 master-0 kubenswrapper[29252]: I1203 20:15:14.418559 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-65494c7-95b4g"] Dec 03 20:15:14.462822 master-0 kubenswrapper[29252]: I1203 20:15:14.462727 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d3f987dc-c7cb-4818-a321-6b92375224a0-console-oauth-config\") pod \"d3f987dc-c7cb-4818-a321-6b92375224a0\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " Dec 03 20:15:14.462822 master-0 kubenswrapper[29252]: I1203 20:15:14.462802 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-service-ca\") pod \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " Dec 03 20:15:14.463438 master-0 kubenswrapper[29252]: I1203 20:15:14.462832 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-audit-policies\") pod \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " Dec 03 20:15:14.463438 master-0 kubenswrapper[29252]: I1203 20:15:14.462863 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-user-template-error\") pod \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " Dec 03 20:15:14.463438 master-0 kubenswrapper[29252]: I1203 20:15:14.462897 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-user-template-provider-selection\") pod \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " Dec 03 20:15:14.463438 master-0 kubenswrapper[29252]: I1203 20:15:14.462940 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj7sv\" (UniqueName: \"kubernetes.io/projected/8db6fff6-9e07-4c7d-97b7-dea394f706c6-kube-api-access-rj7sv\") pod \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " Dec 03 20:15:14.463438 master-0 kubenswrapper[29252]: I1203 20:15:14.462974 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8db6fff6-9e07-4c7d-97b7-dea394f706c6-audit-dir\") pod \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " Dec 03 20:15:14.463438 master-0 kubenswrapper[29252]: I1203 20:15:14.463000 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f987dc-c7cb-4818-a321-6b92375224a0-console-serving-cert\") pod \"d3f987dc-c7cb-4818-a321-6b92375224a0\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " Dec 03 20:15:14.463438 master-0 kubenswrapper[29252]: I1203 20:15:14.463023 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-trusted-ca-bundle\") pod \"d3f987dc-c7cb-4818-a321-6b92375224a0\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " Dec 03 20:15:14.463438 master-0 kubenswrapper[29252]: I1203 20:15:14.463053 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-ocp-branding-template\") pod \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " Dec 03 20:15:14.463438 master-0 kubenswrapper[29252]: I1203 20:15:14.463082 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-session\") pod \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " Dec 03 20:15:14.463438 master-0 kubenswrapper[29252]: I1203 20:15:14.463124 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-serving-cert\") pod \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " Dec 03 20:15:14.463438 master-0 kubenswrapper[29252]: I1203 20:15:14.463171 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmcr9\" (UniqueName: \"kubernetes.io/projected/d3f987dc-c7cb-4818-a321-6b92375224a0-kube-api-access-pmcr9\") pod \"d3f987dc-c7cb-4818-a321-6b92375224a0\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " Dec 03 20:15:14.463438 master-0 kubenswrapper[29252]: I1203 20:15:14.463207 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-user-template-login\") pod \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " Dec 03 20:15:14.463438 master-0 kubenswrapper[29252]: I1203 20:15:14.463240 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-console-config\") pod \"d3f987dc-c7cb-4818-a321-6b92375224a0\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " Dec 03 20:15:14.463438 master-0 kubenswrapper[29252]: I1203 20:15:14.463267 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-oauth-serving-cert\") pod \"d3f987dc-c7cb-4818-a321-6b92375224a0\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " Dec 03 20:15:14.463438 master-0 kubenswrapper[29252]: I1203 20:15:14.463302 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-trusted-ca-bundle\") pod \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " Dec 03 20:15:14.463438 master-0 kubenswrapper[29252]: I1203 20:15:14.463357 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-router-certs\") pod \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " Dec 03 20:15:14.463438 master-0 kubenswrapper[29252]: I1203 20:15:14.463383 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-service-ca\") pod \"d3f987dc-c7cb-4818-a321-6b92375224a0\" (UID: \"d3f987dc-c7cb-4818-a321-6b92375224a0\") " Dec 03 20:15:14.463438 master-0 kubenswrapper[29252]: I1203 20:15:14.463406 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-cliconfig\") pod \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\" (UID: \"8db6fff6-9e07-4c7d-97b7-dea394f706c6\") " Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.463637 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-session\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.463670 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-service-ca\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.463797 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d45f85ea-34d0-4a87-9c62-39101a4756af-audit-dir\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.463878 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-user-template-error\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " 
pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.463910 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d45f85ea-34d0-4a87-9c62-39101a4756af-audit-policies\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.464018 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.464098 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.464124 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psks2\" (UniqueName: \"kubernetes.io/projected/d45f85ea-34d0-4a87-9c62-39101a4756af-kube-api-access-psks2\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 
20:15:14.464151 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-router-certs\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.464182 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.464224 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.464257 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.464281 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-user-template-login\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.466208 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "8db6fff6-9e07-4c7d-97b7-dea394f706c6" (UID: "8db6fff6-9e07-4c7d-97b7-dea394f706c6"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.466328 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "8db6fff6-9e07-4c7d-97b7-dea394f706c6" (UID: "8db6fff6-9e07-4c7d-97b7-dea394f706c6"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.467116 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8db6fff6-9e07-4c7d-97b7-dea394f706c6-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "8db6fff6-9e07-4c7d-97b7-dea394f706c6" (UID: "8db6fff6-9e07-4c7d-97b7-dea394f706c6"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.467281 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d3f987dc-c7cb-4818-a321-6b92375224a0" (UID: "d3f987dc-c7cb-4818-a321-6b92375224a0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.467789 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "8db6fff6-9e07-4c7d-97b7-dea394f706c6" (UID: "8db6fff6-9e07-4c7d-97b7-dea394f706c6"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.468170 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-service-ca" (OuterVolumeSpecName: "service-ca") pod "d3f987dc-c7cb-4818-a321-6b92375224a0" (UID: "d3f987dc-c7cb-4818-a321-6b92375224a0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.468364 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "8db6fff6-9e07-4c7d-97b7-dea394f706c6" (UID: "8db6fff6-9e07-4c7d-97b7-dea394f706c6"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.468511 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-console-config" (OuterVolumeSpecName: "console-config") pod "d3f987dc-c7cb-4818-a321-6b92375224a0" (UID: "d3f987dc-c7cb-4818-a321-6b92375224a0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:15:14.469535 master-0 kubenswrapper[29252]: I1203 20:15:14.468525 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d3f987dc-c7cb-4818-a321-6b92375224a0" (UID: "d3f987dc-c7cb-4818-a321-6b92375224a0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:15:14.472653 master-0 kubenswrapper[29252]: I1203 20:15:14.470975 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "8db6fff6-9e07-4c7d-97b7-dea394f706c6" (UID: "8db6fff6-9e07-4c7d-97b7-dea394f706c6"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:15:14.477269 master-0 kubenswrapper[29252]: I1203 20:15:14.473012 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "8db6fff6-9e07-4c7d-97b7-dea394f706c6" (UID: "8db6fff6-9e07-4c7d-97b7-dea394f706c6"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:15:14.477269 master-0 kubenswrapper[29252]: I1203 20:15:14.474024 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3f987dc-c7cb-4818-a321-6b92375224a0-kube-api-access-pmcr9" (OuterVolumeSpecName: "kube-api-access-pmcr9") pod "d3f987dc-c7cb-4818-a321-6b92375224a0" (UID: "d3f987dc-c7cb-4818-a321-6b92375224a0"). InnerVolumeSpecName "kube-api-access-pmcr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:15:14.477269 master-0 kubenswrapper[29252]: I1203 20:15:14.474179 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "8db6fff6-9e07-4c7d-97b7-dea394f706c6" (UID: "8db6fff6-9e07-4c7d-97b7-dea394f706c6"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:15:14.477269 master-0 kubenswrapper[29252]: I1203 20:15:14.474246 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f987dc-c7cb-4818-a321-6b92375224a0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d3f987dc-c7cb-4818-a321-6b92375224a0" (UID: "d3f987dc-c7cb-4818-a321-6b92375224a0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:15:14.477269 master-0 kubenswrapper[29252]: I1203 20:15:14.474967 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "8db6fff6-9e07-4c7d-97b7-dea394f706c6" (UID: "8db6fff6-9e07-4c7d-97b7-dea394f706c6"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:15:14.477269 master-0 kubenswrapper[29252]: I1203 20:15:14.475878 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "8db6fff6-9e07-4c7d-97b7-dea394f706c6" (UID: "8db6fff6-9e07-4c7d-97b7-dea394f706c6"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:15:14.477269 master-0 kubenswrapper[29252]: I1203 20:15:14.476190 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f987dc-c7cb-4818-a321-6b92375224a0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d3f987dc-c7cb-4818-a321-6b92375224a0" (UID: "d3f987dc-c7cb-4818-a321-6b92375224a0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:15:14.484226 master-0 kubenswrapper[29252]: I1203 20:15:14.484175 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "8db6fff6-9e07-4c7d-97b7-dea394f706c6" (UID: "8db6fff6-9e07-4c7d-97b7-dea394f706c6"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:15:14.484307 master-0 kubenswrapper[29252]: I1203 20:15:14.484234 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db6fff6-9e07-4c7d-97b7-dea394f706c6-kube-api-access-rj7sv" (OuterVolumeSpecName: "kube-api-access-rj7sv") pod "8db6fff6-9e07-4c7d-97b7-dea394f706c6" (UID: "8db6fff6-9e07-4c7d-97b7-dea394f706c6"). InnerVolumeSpecName "kube-api-access-rj7sv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:15:14.487988 master-0 kubenswrapper[29252]: I1203 20:15:14.487920 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "8db6fff6-9e07-4c7d-97b7-dea394f706c6" (UID: "8db6fff6-9e07-4c7d-97b7-dea394f706c6"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:15:14.562711 master-0 kubenswrapper[29252]: I1203 20:15:14.561592 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b6f946576-zgpxr_33f0cc6e-2015-4c7e-848f-ccca37ad61c4/console/0.log" Dec 03 20:15:14.562711 master-0 kubenswrapper[29252]: I1203 20:15:14.561665 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b6f946576-zgpxr" Dec 03 20:15:14.572301 master-0 kubenswrapper[29252]: I1203 20:15:14.567652 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.572301 master-0 kubenswrapper[29252]: I1203 20:15:14.567697 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.572301 master-0 kubenswrapper[29252]: I1203 20:15:14.567722 29252 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.572301 master-0 kubenswrapper[29252]: I1203 20:15:14.567739 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-user-template-login\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.572301 master-0 kubenswrapper[29252]: I1203 20:15:14.567787 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-session\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.572301 master-0 kubenswrapper[29252]: I1203 20:15:14.567807 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-service-ca\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.572301 master-0 kubenswrapper[29252]: I1203 20:15:14.567838 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d45f85ea-34d0-4a87-9c62-39101a4756af-audit-dir\") pod 
\"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.572301 master-0 kubenswrapper[29252]: I1203 20:15:14.567870 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-user-template-error\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.572301 master-0 kubenswrapper[29252]: I1203 20:15:14.567886 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d45f85ea-34d0-4a87-9c62-39101a4756af-audit-policies\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.572301 master-0 kubenswrapper[29252]: I1203 20:15:14.569362 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.572301 master-0 kubenswrapper[29252]: I1203 20:15:14.569402 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.572301 master-0 
kubenswrapper[29252]: I1203 20:15:14.571584 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psks2\" (UniqueName: \"kubernetes.io/projected/d45f85ea-34d0-4a87-9c62-39101a4756af-kube-api-access-psks2\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.572301 master-0 kubenswrapper[29252]: I1203 20:15:14.571661 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-router-certs\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572380 29252 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572405 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmcr9\" (UniqueName: \"kubernetes.io/projected/d3f987dc-c7cb-4818-a321-6b92375224a0-kube-api-access-pmcr9\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572417 29252 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572431 29252 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-console-config\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572443 29252 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572458 29252 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572469 29252 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572479 29252 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572494 29252 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572555 29252 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d3f987dc-c7cb-4818-a321-6b92375224a0-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 
kubenswrapper[29252]: I1203 20:15:14.572576 29252 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572585 29252 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8db6fff6-9e07-4c7d-97b7-dea394f706c6-audit-policies\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572597 29252 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572639 29252 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572650 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj7sv\" (UniqueName: \"kubernetes.io/projected/8db6fff6-9e07-4c7d-97b7-dea394f706c6-kube-api-access-rj7sv\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572660 29252 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f987dc-c7cb-4818-a321-6b92375224a0-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572770 29252 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/8db6fff6-9e07-4c7d-97b7-dea394f706c6-audit-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572800 29252 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3f987dc-c7cb-4818-a321-6b92375224a0-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572813 29252 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.573921 master-0 kubenswrapper[29252]: I1203 20:15:14.572835 29252 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8db6fff6-9e07-4c7d-97b7-dea394f706c6-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.586630 master-0 kubenswrapper[29252]: I1203 20:15:14.575226 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d45f85ea-34d0-4a87-9c62-39101a4756af-audit-policies\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.586630 master-0 kubenswrapper[29252]: I1203 20:15:14.575771 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d45f85ea-34d0-4a87-9c62-39101a4756af-audit-dir\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.586630 master-0 kubenswrapper[29252]: I1203 20:15:14.577481 29252 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.586630 master-0 kubenswrapper[29252]: I1203 20:15:14.578686 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-service-ca\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.586630 master-0 kubenswrapper[29252]: I1203 20:15:14.579010 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.587389 master-0 kubenswrapper[29252]: I1203 20:15:14.587311 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-user-template-login\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.588200 master-0 kubenswrapper[29252]: I1203 20:15:14.588161 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.591230 master-0 kubenswrapper[29252]: I1203 20:15:14.591167 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-router-certs\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.605878 master-0 kubenswrapper[29252]: I1203 20:15:14.605623 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-session\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.607168 master-0 kubenswrapper[29252]: I1203 20:15:14.607094 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.607246 master-0 kubenswrapper[29252]: I1203 20:15:14.607223 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.607361 master-0 kubenswrapper[29252]: I1203 
20:15:14.607337 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d45f85ea-34d0-4a87-9c62-39101a4756af-v4-0-config-user-template-error\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.609566 master-0 kubenswrapper[29252]: I1203 20:15:14.609539 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psks2\" (UniqueName: \"kubernetes.io/projected/d45f85ea-34d0-4a87-9c62-39101a4756af-kube-api-access-psks2\") pod \"oauth-openshift-65494c7-95b4g\" (UID: \"d45f85ea-34d0-4a87-9c62-39101a4756af\") " pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.674295 master-0 kubenswrapper[29252]: I1203 20:15:14.674237 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6k8b\" (UniqueName: \"kubernetes.io/projected/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-kube-api-access-l6k8b\") pod \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " Dec 03 20:15:14.674543 master-0 kubenswrapper[29252]: I1203 20:15:14.674338 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-console-serving-cert\") pod \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " Dec 03 20:15:14.675098 master-0 kubenswrapper[29252]: I1203 20:15:14.675063 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-console-config\") pod \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " Dec 03 20:15:14.675229 master-0 kubenswrapper[29252]: I1203 
20:15:14.675205 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-console-oauth-config\") pod \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " Dec 03 20:15:14.675320 master-0 kubenswrapper[29252]: I1203 20:15:14.675299 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-oauth-serving-cert\") pod \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " Dec 03 20:15:14.675357 master-0 kubenswrapper[29252]: I1203 20:15:14.675325 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-service-ca\") pod \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\" (UID: \"33f0cc6e-2015-4c7e-848f-ccca37ad61c4\") " Dec 03 20:15:14.675971 master-0 kubenswrapper[29252]: I1203 20:15:14.675927 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-console-config" (OuterVolumeSpecName: "console-config") pod "33f0cc6e-2015-4c7e-848f-ccca37ad61c4" (UID: "33f0cc6e-2015-4c7e-848f-ccca37ad61c4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:15:14.676034 master-0 kubenswrapper[29252]: I1203 20:15:14.675971 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-service-ca" (OuterVolumeSpecName: "service-ca") pod "33f0cc6e-2015-4c7e-848f-ccca37ad61c4" (UID: "33f0cc6e-2015-4c7e-848f-ccca37ad61c4"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:15:14.676034 master-0 kubenswrapper[29252]: I1203 20:15:14.675952 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "33f0cc6e-2015-4c7e-848f-ccca37ad61c4" (UID: "33f0cc6e-2015-4c7e-848f-ccca37ad61c4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:15:14.678326 master-0 kubenswrapper[29252]: I1203 20:15:14.678262 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-kube-api-access-l6k8b" (OuterVolumeSpecName: "kube-api-access-l6k8b") pod "33f0cc6e-2015-4c7e-848f-ccca37ad61c4" (UID: "33f0cc6e-2015-4c7e-848f-ccca37ad61c4"). InnerVolumeSpecName "kube-api-access-l6k8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:15:14.679321 master-0 kubenswrapper[29252]: I1203 20:15:14.679282 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "33f0cc6e-2015-4c7e-848f-ccca37ad61c4" (UID: "33f0cc6e-2015-4c7e-848f-ccca37ad61c4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:15:14.682481 master-0 kubenswrapper[29252]: I1203 20:15:14.682443 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "33f0cc6e-2015-4c7e-848f-ccca37ad61c4" (UID: "33f0cc6e-2015-4c7e-848f-ccca37ad61c4"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:15:14.716508 master-0 kubenswrapper[29252]: I1203 20:15:14.716465 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-65494c7-95b4g" Dec 03 20:15:14.777192 master-0 kubenswrapper[29252]: I1203 20:15:14.777146 29252 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.777192 master-0 kubenswrapper[29252]: I1203 20:15:14.777192 29252 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.777403 master-0 kubenswrapper[29252]: I1203 20:15:14.777205 29252 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.777403 master-0 kubenswrapper[29252]: I1203 20:15:14.777223 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6k8b\" (UniqueName: \"kubernetes.io/projected/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-kube-api-access-l6k8b\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.777403 master-0 kubenswrapper[29252]: I1203 20:15:14.777236 29252 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:14.777403 master-0 kubenswrapper[29252]: I1203 20:15:14.777246 29252 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/33f0cc6e-2015-4c7e-848f-ccca37ad61c4-console-config\") on node 
\"master-0\" DevicePath \"\"" Dec 03 20:15:14.865063 master-0 kubenswrapper[29252]: I1203 20:15:14.864983 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2"] Dec 03 20:15:14.949344 master-0 kubenswrapper[29252]: I1203 20:15:14.949290 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6465b775c-7mmtn"] Dec 03 20:15:14.958032 master-0 kubenswrapper[29252]: W1203 20:15:14.957984 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d2903de_a51a_415a_80be_9ba79b4e173d.slice/crio-4f93b8b809a825c50c7be9c2190dbcd0c45c4c033c75a39e6240c325280e1f11 WatchSource:0}: Error finding container 4f93b8b809a825c50c7be9c2190dbcd0c45c4c033c75a39e6240c325280e1f11: Status 404 returned error can't find the container with id 4f93b8b809a825c50c7be9c2190dbcd0c45c4c033c75a39e6240c325280e1f11 Dec 03 20:15:14.965560 master-0 kubenswrapper[29252]: I1203 20:15:14.962683 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fbbdc9bc8-g94br_d3f987dc-c7cb-4818-a321-6b92375224a0/console/0.log" Dec 03 20:15:14.965560 master-0 kubenswrapper[29252]: I1203 20:15:14.962760 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fbbdc9bc8-g94br" event={"ID":"d3f987dc-c7cb-4818-a321-6b92375224a0","Type":"ContainerDied","Data":"69f3052cc78de7d39af42d8819f14ebf2072ed3a299de11beb056fbaaefe82f0"} Dec 03 20:15:14.965560 master-0 kubenswrapper[29252]: I1203 20:15:14.962818 29252 scope.go:117] "RemoveContainer" containerID="3f031dce6a1dc6f83e22c81233f26ae445eb5ba510b6d092c4d465872022408d" Dec 03 20:15:14.965560 master-0 kubenswrapper[29252]: I1203 20:15:14.962816 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fbbdc9bc8-g94br" Dec 03 20:15:14.984414 master-0 kubenswrapper[29252]: I1203 20:15:14.984318 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3560529-2f6a-4193-b606-18474b120488","Type":"ContainerStarted","Data":"da1fa4988d2b94cb2e703112b63c61fa999bce00d13cedeb5375da4e716443a5"} Dec 03 20:15:14.986381 master-0 kubenswrapper[29252]: I1203 20:15:14.986342 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2" event={"ID":"bb848be3-861d-4b35-b3ee-1b19720c1e4c","Type":"ContainerStarted","Data":"a056d678ab30e74641c28663c990f8f3d9a2c5e3fa84122a7d4e39c841d573b7"} Dec 03 20:15:14.988987 master-0 kubenswrapper[29252]: I1203 20:15:14.988955 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"108176a9-101d-4204-8ed3-4ed41ccdaae0","Type":"ContainerStarted","Data":"bb9066638a39b91f2bc07e934233abcb8b6f75e527303c906c13e755c9e51aee"} Dec 03 20:15:14.989067 master-0 kubenswrapper[29252]: I1203 20:15:14.988995 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"108176a9-101d-4204-8ed3-4ed41ccdaae0","Type":"ContainerStarted","Data":"16fdc6b40014b79b88a4feddf0d138a0e52af227df44c76bb17b06c77df24956"} Dec 03 20:15:14.990827 master-0 kubenswrapper[29252]: I1203 20:15:14.990749 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b6f946576-zgpxr_33f0cc6e-2015-4c7e-848f-ccca37ad61c4/console/0.log" Dec 03 20:15:14.991115 master-0 kubenswrapper[29252]: I1203 20:15:14.991039 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b6f946576-zgpxr"
Dec 03 20:15:14.991115 master-0 kubenswrapper[29252]: I1203 20:15:14.991067 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b6f946576-zgpxr" event={"ID":"33f0cc6e-2015-4c7e-848f-ccca37ad61c4","Type":"ContainerDied","Data":"8ab6f76bd1e5bfd809855b37cc1500209792bebbeb2a5ca5cd9846c952e06d3f"}
Dec 03 20:15:14.992533 master-0 kubenswrapper[29252]: I1203 20:15:14.992495 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9" event={"ID":"8db6fff6-9e07-4c7d-97b7-dea394f706c6","Type":"ContainerDied","Data":"9243afb7e08524d8a4682638e082352b84f1ae98af760a018077fe7f49f550aa"}
Dec 03 20:15:14.992607 master-0 kubenswrapper[29252]: I1203 20:15:14.992533 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-65d8f97447-xswx9"
Dec 03 20:15:14.994817 master-0 kubenswrapper[29252]: I1203 20:15:14.994175 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6f5db8559b-hd8bd" event={"ID":"68a9a9c5-fd3f-4a9b-ba9f-6a02dbb70f64","Type":"ContainerStarted","Data":"963634d430d9d5c23cb8c3fe69cfd0f9f3ca8933ef1f41447399532ed67a401f"}
Dec 03 20:15:14.994817 master-0 kubenswrapper[29252]: I1203 20:15:14.994372 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-6f5db8559b-hd8bd"
Dec 03 20:15:14.996251 master-0 kubenswrapper[29252]: I1203 20:15:14.996213 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-7c696657b7-nvqxt" event={"ID":"0590f508-0371-4369-a9f7-ee4ef5acbcac","Type":"ContainerStarted","Data":"793f124a9da8c287a38a5c16353dbeea1542cccdac936e8274f5b81cb6c30a38"}
Dec 03 20:15:14.996720 master-0 kubenswrapper[29252]: I1203 20:15:14.996691 29252 patch_prober.go:28] interesting pod/downloads-6f5db8559b-hd8bd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" start-of-body=
Dec 03 20:15:14.996810 master-0 kubenswrapper[29252]: I1203 20:15:14.996742 29252 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6f5db8559b-hd8bd" podUID="68a9a9c5-fd3f-4a9b-ba9f-6a02dbb70f64" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused"
Dec 03 20:15:15.000498 master-0 kubenswrapper[29252]: I1203 20:15:15.000478 29252 scope.go:117] "RemoveContainer" containerID="568b90ae97cb9fa30fe2248b862ed4b9d85f7d7109a889f6a7199ab4cbf90805"
Dec 03 20:15:15.024052 master-0 kubenswrapper[29252]: I1203 20:15:15.023958 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=30.37737162 podStartE2EDuration="35.0239392s" podCreationTimestamp="2025-12-03 20:14:40 +0000 UTC" firstStartedPulling="2025-12-03 20:14:45.608309272 +0000 UTC m=+320.421854235" lastFinishedPulling="2025-12-03 20:14:50.254876862 +0000 UTC m=+325.068421815" observedRunningTime="2025-12-03 20:15:15.01781842 +0000 UTC m=+349.831363393" watchObservedRunningTime="2025-12-03 20:15:15.0239392 +0000 UTC m=+349.837484153"
Dec 03 20:15:15.039314 master-0 kubenswrapper[29252]: I1203 20:15:15.039230 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6f5db8559b-hd8bd" podStartSLOduration=2.084717619 podStartE2EDuration="50.039141117s" podCreationTimestamp="2025-12-03 20:14:25 +0000 UTC" firstStartedPulling="2025-12-03 20:14:26.628490366 +0000 UTC m=+301.442035329" lastFinishedPulling="2025-12-03 20:15:14.582913874 +0000 UTC m=+349.396458827" observedRunningTime="2025-12-03 20:15:15.034908125 +0000 UTC m=+349.848453088" watchObservedRunningTime="2025-12-03 20:15:15.039141117 +0000 UTC m=+349.852686080"
Dec 03 20:15:15.054633 master-0 kubenswrapper[29252]: I1203 20:15:15.054545 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-4-master-0" podStartSLOduration=25.054528311 podStartE2EDuration="25.054528311s" podCreationTimestamp="2025-12-03 20:14:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:15:15.053572008 +0000 UTC m=+349.867116981" watchObservedRunningTime="2025-12-03 20:15:15.054528311 +0000 UTC m=+349.868073264"
Dec 03 20:15:15.070546 master-0 kubenswrapper[29252]: I1203 20:15:15.070482 29252 scope.go:117] "RemoveContainer" containerID="99029e5e551e0cf40043fe1556eb3f5a583127a54d7df23af10275f14e3ca238"
Dec 03 20:15:15.086406 master-0 kubenswrapper[29252]: I1203 20:15:15.086317 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-7c696657b7-nvqxt" podStartSLOduration=5.319600353 podStartE2EDuration="29.086297751s" podCreationTimestamp="2025-12-03 20:14:46 +0000 UTC" firstStartedPulling="2025-12-03 20:14:50.636034215 +0000 UTC m=+325.449579178" lastFinishedPulling="2025-12-03 20:15:14.402731623 +0000 UTC m=+349.216276576" observedRunningTime="2025-12-03 20:15:15.078629776 +0000 UTC m=+349.892174749" watchObservedRunningTime="2025-12-03 20:15:15.086297751 +0000 UTC m=+349.899842724"
Dec 03 20:15:15.154551 master-0 kubenswrapper[29252]: W1203 20:15:15.154508 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd45f85ea_34d0_4a87_9c62_39101a4756af.slice/crio-6e26e83fc8f9094d4f68a106cc213ad2b25bbc7a1d1c38ba741fedc7e4fb4c11 WatchSource:0}: Error finding container 6e26e83fc8f9094d4f68a106cc213ad2b25bbc7a1d1c38ba741fedc7e4fb4c11: Status 404 returned error can't find the container with id 6e26e83fc8f9094d4f68a106cc213ad2b25bbc7a1d1c38ba741fedc7e4fb4c11
Dec 03 20:15:15.158606 master-0 kubenswrapper[29252]: I1203 20:15:15.158500 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b6f946576-zgpxr"]
Dec 03 20:15:15.171268 master-0 kubenswrapper[29252]: I1203 20:15:15.171227 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-65494c7-95b4g"]
Dec 03 20:15:15.180206 master-0 kubenswrapper[29252]: I1203 20:15:15.180134 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5b6f946576-zgpxr"]
Dec 03 20:15:15.183497 master-0 kubenswrapper[29252]: I1203 20:15:15.183465 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fbbdc9bc8-g94br"]
Dec 03 20:15:15.188794 master-0 kubenswrapper[29252]: I1203 20:15:15.188723 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6fbbdc9bc8-g94br"]
Dec 03 20:15:15.195943 master-0 kubenswrapper[29252]: I1203 20:15:15.195900 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-65d8f97447-xswx9"]
Dec 03 20:15:15.200123 master-0 kubenswrapper[29252]: I1203 20:15:15.200011 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-65d8f97447-xswx9"]
Dec 03 20:15:15.435246 master-0 kubenswrapper[29252]: I1203 20:15:15.435174 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33f0cc6e-2015-4c7e-848f-ccca37ad61c4" path="/var/lib/kubelet/pods/33f0cc6e-2015-4c7e-848f-ccca37ad61c4/volumes"
Dec 03 20:15:15.436595 master-0 kubenswrapper[29252]: I1203 20:15:15.436540 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db6fff6-9e07-4c7d-97b7-dea394f706c6" path="/var/lib/kubelet/pods/8db6fff6-9e07-4c7d-97b7-dea394f706c6/volumes"
Dec 03 20:15:15.439078 master-0 kubenswrapper[29252]: I1203 20:15:15.439026 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3f987dc-c7cb-4818-a321-6b92375224a0" path="/var/lib/kubelet/pods/d3f987dc-c7cb-4818-a321-6b92375224a0/volumes"
Dec 03 20:15:16.012374 master-0 kubenswrapper[29252]: I1203 20:15:16.012304 29252 generic.go:334] "Generic (PLEG): container finished" podID="bb848be3-861d-4b35-b3ee-1b19720c1e4c" containerID="dd3a791f6b62ad11e45e5eaff559f63ea24e04c49c9aa1dbb339381bec789a1e" exitCode=0
Dec 03 20:15:16.012615 master-0 kubenswrapper[29252]: I1203 20:15:16.012384 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2" event={"ID":"bb848be3-861d-4b35-b3ee-1b19720c1e4c","Type":"ContainerDied","Data":"dd3a791f6b62ad11e45e5eaff559f63ea24e04c49c9aa1dbb339381bec789a1e"}
Dec 03 20:15:16.015594 master-0 kubenswrapper[29252]: I1203 20:15:16.015540 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-65494c7-95b4g" event={"ID":"d45f85ea-34d0-4a87-9c62-39101a4756af","Type":"ContainerStarted","Data":"b4962673a1ecc32a15981c62291d73c4a9bd3ab0206360cbfd6667e8e0fe6595"}
Dec 03 20:15:16.015689 master-0 kubenswrapper[29252]: I1203 20:15:16.015604 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-65494c7-95b4g" event={"ID":"d45f85ea-34d0-4a87-9c62-39101a4756af","Type":"ContainerStarted","Data":"6e26e83fc8f9094d4f68a106cc213ad2b25bbc7a1d1c38ba741fedc7e4fb4c11"}
Dec 03 20:15:16.015859 master-0 kubenswrapper[29252]: I1203 20:15:16.015810 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-65494c7-95b4g"
Dec 03 20:15:16.023814 master-0 kubenswrapper[29252]: I1203 20:15:16.022875 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6465b775c-7mmtn" event={"ID":"3d2903de-a51a-415a-80be-9ba79b4e173d","Type":"ContainerStarted","Data":"d030e7d53aba644d51860afb7ad0c94bcbff1374eeb55840348470bf3f43fa68"}
Dec 03 20:15:16.023814 master-0 kubenswrapper[29252]: I1203 20:15:16.022938 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6465b775c-7mmtn" event={"ID":"3d2903de-a51a-415a-80be-9ba79b4e173d","Type":"ContainerStarted","Data":"4f93b8b809a825c50c7be9c2190dbcd0c45c4c033c75a39e6240c325280e1f11"}
Dec 03 20:15:16.023814 master-0 kubenswrapper[29252]: I1203 20:15:16.023361 29252 patch_prober.go:28] interesting pod/downloads-6f5db8559b-hd8bd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" start-of-body=
Dec 03 20:15:16.023814 master-0 kubenswrapper[29252]: I1203 20:15:16.023437 29252 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6f5db8559b-hd8bd" podUID="68a9a9c5-fd3f-4a9b-ba9f-6a02dbb70f64" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused"
Dec 03 20:15:16.024210 master-0 kubenswrapper[29252]: I1203 20:15:16.024173 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-65494c7-95b4g"
Dec 03 20:15:16.082852 master-0 kubenswrapper[29252]: I1203 20:15:16.082729 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-65494c7-95b4g" podStartSLOduration=26.082702557 podStartE2EDuration="26.082702557s" podCreationTimestamp="2025-12-03 20:14:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:15:16.078455024 +0000 UTC m=+350.891999977" watchObservedRunningTime="2025-12-03 20:15:16.082702557 +0000 UTC m=+350.896247530"
Dec 03 20:15:16.093588 master-0 kubenswrapper[29252]: I1203 20:15:16.092272 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Dec 03 20:15:16.119148 master-0 kubenswrapper[29252]: I1203 20:15:16.119094 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6465b775c-7mmtn" podStartSLOduration=15.119079419 podStartE2EDuration="15.119079419s" podCreationTimestamp="2025-12-03 20:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:15:16.113290109 +0000 UTC m=+350.926835132" watchObservedRunningTime="2025-12-03 20:15:16.119079419 +0000 UTC m=+350.932624372"
Dec 03 20:15:16.207816 master-0 kubenswrapper[29252]: I1203 20:15:16.207247 29252 patch_prober.go:28] interesting pod/downloads-6f5db8559b-hd8bd container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" start-of-body=
Dec 03 20:15:16.207816 master-0 kubenswrapper[29252]: I1203 20:15:16.207307 29252 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-6f5db8559b-hd8bd" podUID="68a9a9c5-fd3f-4a9b-ba9f-6a02dbb70f64" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused"
Dec 03 20:15:16.213128 master-0 kubenswrapper[29252]: I1203 20:15:16.212889 29252 patch_prober.go:28] interesting pod/downloads-6f5db8559b-hd8bd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" start-of-body=
Dec 03 20:15:16.213128 master-0 kubenswrapper[29252]: I1203 20:15:16.212953 29252 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6f5db8559b-hd8bd" podUID="68a9a9c5-fd3f-4a9b-ba9f-6a02dbb70f64" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused"
Dec 03 20:15:17.474095 master-0 kubenswrapper[29252]: I1203 20:15:17.474057 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2"
Dec 03 20:15:17.542885 master-0 kubenswrapper[29252]: I1203 20:15:17.542819 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb848be3-861d-4b35-b3ee-1b19720c1e4c-config-volume\") pod \"bb848be3-861d-4b35-b3ee-1b19720c1e4c\" (UID: \"bb848be3-861d-4b35-b3ee-1b19720c1e4c\") "
Dec 03 20:15:17.543335 master-0 kubenswrapper[29252]: I1203 20:15:17.542971 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb848be3-861d-4b35-b3ee-1b19720c1e4c-secret-volume\") pod \"bb848be3-861d-4b35-b3ee-1b19720c1e4c\" (UID: \"bb848be3-861d-4b35-b3ee-1b19720c1e4c\") "
Dec 03 20:15:17.543335 master-0 kubenswrapper[29252]: I1203 20:15:17.543004 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f8wf\" (UniqueName: \"kubernetes.io/projected/bb848be3-861d-4b35-b3ee-1b19720c1e4c-kube-api-access-9f8wf\") pod \"bb848be3-861d-4b35-b3ee-1b19720c1e4c\" (UID: \"bb848be3-861d-4b35-b3ee-1b19720c1e4c\") "
Dec 03 20:15:17.543862 master-0 kubenswrapper[29252]: I1203 20:15:17.543735 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb848be3-861d-4b35-b3ee-1b19720c1e4c-config-volume" (OuterVolumeSpecName: "config-volume") pod "bb848be3-861d-4b35-b3ee-1b19720c1e4c" (UID: "bb848be3-861d-4b35-b3ee-1b19720c1e4c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 20:15:17.545883 master-0 kubenswrapper[29252]: I1203 20:15:17.545740 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb848be3-861d-4b35-b3ee-1b19720c1e4c-kube-api-access-9f8wf" (OuterVolumeSpecName: "kube-api-access-9f8wf") pod "bb848be3-861d-4b35-b3ee-1b19720c1e4c" (UID: "bb848be3-861d-4b35-b3ee-1b19720c1e4c"). InnerVolumeSpecName "kube-api-access-9f8wf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 20:15:17.546945 master-0 kubenswrapper[29252]: I1203 20:15:17.546900 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb848be3-861d-4b35-b3ee-1b19720c1e4c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bb848be3-861d-4b35-b3ee-1b19720c1e4c" (UID: "bb848be3-861d-4b35-b3ee-1b19720c1e4c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 20:15:17.645046 master-0 kubenswrapper[29252]: I1203 20:15:17.644847 29252 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb848be3-861d-4b35-b3ee-1b19720c1e4c-secret-volume\") on node \"master-0\" DevicePath \"\""
Dec 03 20:15:17.645046 master-0 kubenswrapper[29252]: I1203 20:15:17.644918 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f8wf\" (UniqueName: \"kubernetes.io/projected/bb848be3-861d-4b35-b3ee-1b19720c1e4c-kube-api-access-9f8wf\") on node \"master-0\" DevicePath \"\""
Dec 03 20:15:17.645046 master-0 kubenswrapper[29252]: I1203 20:15:17.644942 29252 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb848be3-861d-4b35-b3ee-1b19720c1e4c-config-volume\") on node \"master-0\" DevicePath \"\""
Dec 03 20:15:17.656984 master-0 kubenswrapper[29252]: I1203 20:15:17.656900 29252 patch_prober.go:28] interesting pod/console-65c74dc56f-mlqjw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body=
Dec 03 20:15:17.657146 master-0 kubenswrapper[29252]: I1203 20:15:17.656992 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-65c74dc56f-mlqjw" podUID="e13ed7cc-6322-4676-88fc-363cff00f509" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused"
Dec 03 20:15:18.045379 master-0 kubenswrapper[29252]: I1203 20:15:18.045221 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2"
Dec 03 20:15:18.045379 master-0 kubenswrapper[29252]: I1203 20:15:18.045278 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413215-cb4r2" event={"ID":"bb848be3-861d-4b35-b3ee-1b19720c1e4c","Type":"ContainerDied","Data":"a056d678ab30e74641c28663c990f8f3d9a2c5e3fa84122a7d4e39c841d573b7"}
Dec 03 20:15:18.045379 master-0 kubenswrapper[29252]: I1203 20:15:18.045307 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a056d678ab30e74641c28663c990f8f3d9a2c5e3fa84122a7d4e39c841d573b7"
Dec 03 20:15:20.674885 master-0 kubenswrapper[29252]: I1203 20:15:20.674806 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt"
Dec 03 20:15:20.681384 master-0 kubenswrapper[29252]: I1203 20:15:20.681320 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6b8786b56c-g7dqt"
Dec 03 20:15:21.823054 master-0 kubenswrapper[29252]: I1203 20:15:21.822966 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6465b775c-7mmtn"
Dec 03 20:15:21.823054 master-0 kubenswrapper[29252]: I1203 20:15:21.823032 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6465b775c-7mmtn"
Dec 03 20:15:21.824293 master-0 kubenswrapper[29252]: I1203 20:15:21.824251 29252 patch_prober.go:28] interesting pod/console-6465b775c-7mmtn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.108:8443/health\": dial tcp 10.128.0.108:8443: connect: connection refused" start-of-body=
Dec 03 20:15:21.824468 master-0 kubenswrapper[29252]: I1203 20:15:21.824289 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6465b775c-7mmtn" podUID="3d2903de-a51a-415a-80be-9ba79b4e173d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.108:8443/health\": dial tcp 10.128.0.108:8443: connect: connection refused"
Dec 03 20:15:25.689453 master-0 kubenswrapper[29252]: I1203 20:15:25.689377 29252 scope.go:117] "RemoveContainer" containerID="10dd5e50757ca6d8fb428d9d41440e88b1cc3fce51685a0860bb2b0898ea0950"
Dec 03 20:15:26.224155 master-0 kubenswrapper[29252]: I1203 20:15:26.224049 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6f5db8559b-hd8bd"
Dec 03 20:15:26.485288 master-0 kubenswrapper[29252]: I1203 20:15:26.485095 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-5b5d787587-g9t7c" podUID="6426848a-3e1d-4988-9749-5e7fc2620e51" containerName="console" containerID="cri-o://3a9b6e3578080fa1e5c782639b46b837e227649335bdd9698dc3fcff6bb5a882" gracePeriod=15
Dec 03 20:15:27.660460 master-0 kubenswrapper[29252]: I1203 20:15:27.660130 29252 patch_prober.go:28] interesting pod/console-65c74dc56f-mlqjw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body=
Dec 03 20:15:27.660460 master-0 kubenswrapper[29252]: I1203 20:15:27.660248 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-65c74dc56f-mlqjw" podUID="e13ed7cc-6322-4676-88fc-363cff00f509" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused"
Dec 03 20:15:28.142966 master-0 kubenswrapper[29252]: I1203 20:15:28.142899 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b5d787587-g9t7c_6426848a-3e1d-4988-9749-5e7fc2620e51/console/0.log"
Dec 03 20:15:28.142966 master-0 kubenswrapper[29252]: I1203 20:15:28.142969 29252 generic.go:334] "Generic (PLEG): container finished" podID="6426848a-3e1d-4988-9749-5e7fc2620e51" containerID="3a9b6e3578080fa1e5c782639b46b837e227649335bdd9698dc3fcff6bb5a882" exitCode=2
Dec 03 20:15:28.143407 master-0 kubenswrapper[29252]: I1203 20:15:28.143006 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b5d787587-g9t7c" event={"ID":"6426848a-3e1d-4988-9749-5e7fc2620e51","Type":"ContainerDied","Data":"3a9b6e3578080fa1e5c782639b46b837e227649335bdd9698dc3fcff6bb5a882"}
Dec 03 20:15:29.149619 master-0 kubenswrapper[29252]: I1203 20:15:29.149535 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b5d787587-g9t7c_6426848a-3e1d-4988-9749-5e7fc2620e51/console/0.log"
Dec 03 20:15:29.150776 master-0 kubenswrapper[29252]: I1203 20:15:29.149644 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b5d787587-g9t7c"
Dec 03 20:15:29.158416 master-0 kubenswrapper[29252]: I1203 20:15:29.158364 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b5d787587-g9t7c_6426848a-3e1d-4988-9749-5e7fc2620e51/console/0.log"
Dec 03 20:15:29.158605 master-0 kubenswrapper[29252]: I1203 20:15:29.158437 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b5d787587-g9t7c" event={"ID":"6426848a-3e1d-4988-9749-5e7fc2620e51","Type":"ContainerDied","Data":"3d2b120dbe3c83a9d3ee67340b719b29c2f99528319c49674650d81821263c36"}
Dec 03 20:15:29.158605 master-0 kubenswrapper[29252]: I1203 20:15:29.158517 29252 scope.go:117] "RemoveContainer" containerID="3a9b6e3578080fa1e5c782639b46b837e227649335bdd9698dc3fcff6bb5a882"
Dec 03 20:15:29.158764 master-0 kubenswrapper[29252]: I1203 20:15:29.158579 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b5d787587-g9t7c"
Dec 03 20:15:29.254887 master-0 kubenswrapper[29252]: I1203 20:15:29.254760 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-oauth-serving-cert\") pod \"6426848a-3e1d-4988-9749-5e7fc2620e51\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") "
Dec 03 20:15:29.255123 master-0 kubenswrapper[29252]: I1203 20:15:29.254908 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-service-ca\") pod \"6426848a-3e1d-4988-9749-5e7fc2620e51\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") "
Dec 03 20:15:29.255123 master-0 kubenswrapper[29252]: I1203 20:15:29.254953 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-console-config\") pod \"6426848a-3e1d-4988-9749-5e7fc2620e51\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") "
Dec 03 20:15:29.255123 master-0 kubenswrapper[29252]: I1203 20:15:29.255006 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6426848a-3e1d-4988-9749-5e7fc2620e51-console-serving-cert\") pod \"6426848a-3e1d-4988-9749-5e7fc2620e51\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") "
Dec 03 20:15:29.255123 master-0 kubenswrapper[29252]: I1203 20:15:29.255051 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-trusted-ca-bundle\") pod \"6426848a-3e1d-4988-9749-5e7fc2620e51\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") "
Dec 03 20:15:29.255123 master-0 kubenswrapper[29252]: I1203 20:15:29.255101 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6426848a-3e1d-4988-9749-5e7fc2620e51-console-oauth-config\") pod \"6426848a-3e1d-4988-9749-5e7fc2620e51\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") "
Dec 03 20:15:29.255372 master-0 kubenswrapper[29252]: I1203 20:15:29.255186 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljpkb\" (UniqueName: \"kubernetes.io/projected/6426848a-3e1d-4988-9749-5e7fc2620e51-kube-api-access-ljpkb\") pod \"6426848a-3e1d-4988-9749-5e7fc2620e51\" (UID: \"6426848a-3e1d-4988-9749-5e7fc2620e51\") "
Dec 03 20:15:29.255466 master-0 kubenswrapper[29252]: I1203 20:15:29.255426 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-service-ca" (OuterVolumeSpecName: "service-ca") pod "6426848a-3e1d-4988-9749-5e7fc2620e51" (UID: "6426848a-3e1d-4988-9749-5e7fc2620e51"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 20:15:29.255466 master-0 kubenswrapper[29252]: I1203 20:15:29.255442 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6426848a-3e1d-4988-9749-5e7fc2620e51" (UID: "6426848a-3e1d-4988-9749-5e7fc2620e51"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 20:15:29.256050 master-0 kubenswrapper[29252]: I1203 20:15:29.255710 29252 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-oauth-serving-cert\") on node \"master-0\" DevicePath \"\""
Dec 03 20:15:29.256050 master-0 kubenswrapper[29252]: I1203 20:15:29.255733 29252 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-service-ca\") on node \"master-0\" DevicePath \"\""
Dec 03 20:15:29.256174 master-0 kubenswrapper[29252]: I1203 20:15:29.256137 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-console-config" (OuterVolumeSpecName: "console-config") pod "6426848a-3e1d-4988-9749-5e7fc2620e51" (UID: "6426848a-3e1d-4988-9749-5e7fc2620e51"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 20:15:29.256532 master-0 kubenswrapper[29252]: I1203 20:15:29.256454 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6426848a-3e1d-4988-9749-5e7fc2620e51" (UID: "6426848a-3e1d-4988-9749-5e7fc2620e51"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 20:15:29.258156 master-0 kubenswrapper[29252]: I1203 20:15:29.258092 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6426848a-3e1d-4988-9749-5e7fc2620e51-kube-api-access-ljpkb" (OuterVolumeSpecName: "kube-api-access-ljpkb") pod "6426848a-3e1d-4988-9749-5e7fc2620e51" (UID: "6426848a-3e1d-4988-9749-5e7fc2620e51"). InnerVolumeSpecName "kube-api-access-ljpkb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 20:15:29.260437 master-0 kubenswrapper[29252]: I1203 20:15:29.260374 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6426848a-3e1d-4988-9749-5e7fc2620e51-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6426848a-3e1d-4988-9749-5e7fc2620e51" (UID: "6426848a-3e1d-4988-9749-5e7fc2620e51"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 20:15:29.261352 master-0 kubenswrapper[29252]: I1203 20:15:29.261318 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6426848a-3e1d-4988-9749-5e7fc2620e51-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6426848a-3e1d-4988-9749-5e7fc2620e51" (UID: "6426848a-3e1d-4988-9749-5e7fc2620e51"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 20:15:29.357284 master-0 kubenswrapper[29252]: I1203 20:15:29.357134 29252 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-console-config\") on node \"master-0\" DevicePath \"\""
Dec 03 20:15:29.357284 master-0 kubenswrapper[29252]: I1203 20:15:29.357172 29252 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6426848a-3e1d-4988-9749-5e7fc2620e51-console-serving-cert\") on node \"master-0\" DevicePath \"\""
Dec 03 20:15:29.357284 master-0 kubenswrapper[29252]: I1203 20:15:29.357187 29252 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6426848a-3e1d-4988-9749-5e7fc2620e51-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Dec 03 20:15:29.357284 master-0 kubenswrapper[29252]: I1203 20:15:29.357199 29252 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6426848a-3e1d-4988-9749-5e7fc2620e51-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Dec 03 20:15:29.357284 master-0 kubenswrapper[29252]: I1203 20:15:29.357211 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljpkb\" (UniqueName: \"kubernetes.io/projected/6426848a-3e1d-4988-9749-5e7fc2620e51-kube-api-access-ljpkb\") on node \"master-0\" DevicePath \"\""
Dec 03 20:15:31.823010 master-0 kubenswrapper[29252]: I1203 20:15:31.822926 29252 patch_prober.go:28] interesting pod/console-6465b775c-7mmtn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.108:8443/health\": dial tcp 10.128.0.108:8443: connect: connection refused" start-of-body=
Dec 03 20:15:31.823715 master-0 kubenswrapper[29252]: I1203 20:15:31.823012 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6465b775c-7mmtn" podUID="3d2903de-a51a-415a-80be-9ba79b4e173d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.108:8443/health\": dial tcp 10.128.0.108:8443: connect: connection refused"
Dec 03 20:15:35.125033 master-0 kubenswrapper[29252]: I1203 20:15:35.124914 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b5d787587-g9t7c"]
Dec 03 20:15:35.135457 master-0 kubenswrapper[29252]: I1203 20:15:35.135365 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"]
Dec 03 20:15:35.139945 master-0 kubenswrapper[29252]: E1203 20:15:35.139894 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f0cc6e-2015-4c7e-848f-ccca37ad61c4" containerName="console"
Dec 03 20:15:35.139945 master-0 kubenswrapper[29252]: I1203 20:15:35.139941 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f0cc6e-2015-4c7e-848f-ccca37ad61c4" containerName="console"
Dec 03 20:15:35.140064 master-0 kubenswrapper[29252]: E1203 20:15:35.139987 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6426848a-3e1d-4988-9749-5e7fc2620e51" containerName="console"
Dec 03 20:15:35.140064 master-0 kubenswrapper[29252]: I1203 20:15:35.140004 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="6426848a-3e1d-4988-9749-5e7fc2620e51" containerName="console"
Dec 03 20:15:35.140124 master-0 kubenswrapper[29252]: E1203 20:15:35.140059 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb848be3-861d-4b35-b3ee-1b19720c1e4c" containerName="collect-profiles"
Dec 03 20:15:35.140124 master-0 kubenswrapper[29252]: I1203 20:15:35.140078 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb848be3-861d-4b35-b3ee-1b19720c1e4c" containerName="collect-profiles"
Dec 03 20:15:35.140399 master-0 kubenswrapper[29252]: I1203 20:15:35.140364 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="6426848a-3e1d-4988-9749-5e7fc2620e51" containerName="console"
Dec 03 20:15:35.140449 master-0 kubenswrapper[29252]: I1203 20:15:35.140434 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="33f0cc6e-2015-4c7e-848f-ccca37ad61c4" containerName="console"
Dec 03 20:15:35.140512 master-0 kubenswrapper[29252]: I1203 20:15:35.140480 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb848be3-861d-4b35-b3ee-1b19720c1e4c" containerName="collect-profiles"
Dec 03 20:15:35.141414 master-0 kubenswrapper[29252]: I1203 20:15:35.141362 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0"
Dec 03 20:15:35.144371 master-0 kubenswrapper[29252]: I1203 20:15:35.144312 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-nmjr4"
Dec 03 20:15:35.144628 master-0 kubenswrapper[29252]: I1203 20:15:35.144570 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Dec 03 20:15:35.167165 master-0 kubenswrapper[29252]: I1203 20:15:35.167106 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eee0d023-d1ab-4c75-9a92-3a0e42d05168-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"eee0d023-d1ab-4c75-9a92-3a0e42d05168\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Dec 03 20:15:35.167334 master-0 kubenswrapper[29252]: I1203 20:15:35.167202 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eee0d023-d1ab-4c75-9a92-3a0e42d05168-kube-api-access\") pod \"installer-4-master-0\" (UID: \"eee0d023-d1ab-4c75-9a92-3a0e42d05168\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Dec 03 20:15:35.167334 master-0 kubenswrapper[29252]: I1203 20:15:35.167289 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eee0d023-d1ab-4c75-9a92-3a0e42d05168-var-lock\") pod \"installer-4-master-0\" (UID: \"eee0d023-d1ab-4c75-9a92-3a0e42d05168\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Dec 03 20:15:35.270007 master-0 kubenswrapper[29252]: I1203 20:15:35.269873 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eee0d023-d1ab-4c75-9a92-3a0e42d05168-var-lock\") pod \"installer-4-master-0\" (UID: \"eee0d023-d1ab-4c75-9a92-3a0e42d05168\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Dec 03 20:15:35.270332 master-0 kubenswrapper[29252]: I1203 20:15:35.270069 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eee0d023-d1ab-4c75-9a92-3a0e42d05168-var-lock\") pod \"installer-4-master-0\" (UID: \"eee0d023-d1ab-4c75-9a92-3a0e42d05168\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Dec 03 20:15:35.270332 master-0 kubenswrapper[29252]: I1203 20:15:35.270090 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eee0d023-d1ab-4c75-9a92-3a0e42d05168-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"eee0d023-d1ab-4c75-9a92-3a0e42d05168\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Dec 03 20:15:35.270332 master-0 kubenswrapper[29252]: I1203 20:15:35.270155 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eee0d023-d1ab-4c75-9a92-3a0e42d05168-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"eee0d023-d1ab-4c75-9a92-3a0e42d05168\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Dec 03 20:15:35.270332 master-0 kubenswrapper[29252]: I1203 20:15:35.270278 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eee0d023-d1ab-4c75-9a92-3a0e42d05168-kube-api-access\") pod \"installer-4-master-0\" (UID: \"eee0d023-d1ab-4c75-9a92-3a0e42d05168\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Dec 03 20:15:36.042833 master-0 kubenswrapper[29252]: I1203 20:15:36.041870 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5b5d787587-g9t7c"]
Dec 03 20:15:36.052059 master-0 kubenswrapper[29252]: I1203 20:15:36.051974 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"]
Dec 03 20:15:37.292948 master-0 kubenswrapper[29252]: I1203 20:15:37.292876 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eee0d023-d1ab-4c75-9a92-3a0e42d05168-kube-api-access\") pod \"installer-4-master-0\" (UID: \"eee0d023-d1ab-4c75-9a92-3a0e42d05168\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Dec 03 20:15:37.427139 master-0 kubenswrapper[29252]: I1203 20:15:37.427045 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6426848a-3e1d-4988-9749-5e7fc2620e51" path="/var/lib/kubelet/pods/6426848a-3e1d-4988-9749-5e7fc2620e51/volumes"
Dec 03 20:15:37.573760 master-0 kubenswrapper[29252]: I1203 20:15:37.573535 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Dec 03 20:15:37.656898 master-0 kubenswrapper[29252]: I1203 20:15:37.656771 29252 patch_prober.go:28] interesting pod/console-65c74dc56f-mlqjw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Dec 03 20:15:37.656898 master-0 kubenswrapper[29252]: I1203 20:15:37.656902 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-65c74dc56f-mlqjw" podUID="e13ed7cc-6322-4676-88fc-363cff00f509" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Dec 03 20:15:40.468050 master-0 kubenswrapper[29252]: I1203 20:15:40.467970 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Dec 03 20:15:40.480637 master-0 kubenswrapper[29252]: W1203 20:15:40.480548 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podeee0d023_d1ab_4c75_9a92_3a0e42d05168.slice/crio-03053c1707148ce58497905e1f695c64a701c5b906f823e5263d19d69e641fd2 WatchSource:0}: Error finding container 03053c1707148ce58497905e1f695c64a701c5b906f823e5263d19d69e641fd2: Status 404 returned error can't find the container with id 03053c1707148ce58497905e1f695c64a701c5b906f823e5263d19d69e641fd2 Dec 03 20:15:41.088817 master-0 kubenswrapper[29252]: I1203 20:15:41.088576 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:15:41.132614 master-0 kubenswrapper[29252]: I1203 20:15:41.132534 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:15:41.278244 master-0 kubenswrapper[29252]: I1203 20:15:41.278149 29252 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"eee0d023-d1ab-4c75-9a92-3a0e42d05168","Type":"ContainerStarted","Data":"bdf1a5022d494663c447e5c37888568b0c85dc5583346ff17d69da033cbe6f52"} Dec 03 20:15:41.278244 master-0 kubenswrapper[29252]: I1203 20:15:41.278206 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"eee0d023-d1ab-4c75-9a92-3a0e42d05168","Type":"ContainerStarted","Data":"03053c1707148ce58497905e1f695c64a701c5b906f823e5263d19d69e641fd2"} Dec 03 20:15:41.314563 master-0 kubenswrapper[29252]: I1203 20:15:41.314471 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:15:41.822888 master-0 kubenswrapper[29252]: I1203 20:15:41.822817 29252 patch_prober.go:28] interesting pod/console-6465b775c-7mmtn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.108:8443/health\": dial tcp 10.128.0.108:8443: connect: connection refused" start-of-body= Dec 03 20:15:41.822888 master-0 kubenswrapper[29252]: I1203 20:15:41.822883 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6465b775c-7mmtn" podUID="3d2903de-a51a-415a-80be-9ba79b4e173d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.108:8443/health\": dial tcp 10.128.0.108:8443: connect: connection refused" Dec 03 20:15:42.359661 master-0 kubenswrapper[29252]: I1203 20:15:42.359595 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-4-master-0" podStartSLOduration=8.359579865 podStartE2EDuration="8.359579865s" podCreationTimestamp="2025-12-03 20:15:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
20:15:42.357484614 +0000 UTC m=+377.171029567" watchObservedRunningTime="2025-12-03 20:15:42.359579865 +0000 UTC m=+377.173124818" Dec 03 20:15:46.009276 master-0 kubenswrapper[29252]: I1203 20:15:46.009176 29252 patch_prober.go:28] interesting pod/machine-config-daemon-7t8bs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 20:15:46.009276 master-0 kubenswrapper[29252]: I1203 20:15:46.009281 29252 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 20:15:47.656950 master-0 kubenswrapper[29252]: I1203 20:15:47.656742 29252 patch_prober.go:28] interesting pod/console-65c74dc56f-mlqjw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Dec 03 20:15:47.656950 master-0 kubenswrapper[29252]: I1203 20:15:47.656929 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-65c74dc56f-mlqjw" podUID="e13ed7cc-6322-4676-88fc-363cff00f509" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Dec 03 20:15:51.823487 master-0 kubenswrapper[29252]: I1203 20:15:51.823397 29252 patch_prober.go:28] interesting pod/console-6465b775c-7mmtn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.108:8443/health\": dial tcp 10.128.0.108:8443: connect: connection refused" start-of-body= Dec 03 20:15:51.823487 master-0 
kubenswrapper[29252]: I1203 20:15:51.823475 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6465b775c-7mmtn" podUID="3d2903de-a51a-415a-80be-9ba79b4e173d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.108:8443/health\": dial tcp 10.128.0.108:8443: connect: connection refused" Dec 03 20:15:52.403616 master-0 kubenswrapper[29252]: I1203 20:15:52.403541 29252 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 03 20:15:52.404626 master-0 kubenswrapper[29252]: I1203 20:15:52.404074 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="kube-apiserver" containerID="cri-o://964cf291812143ff2c57dd42685f00b48fdffd3b9344956cd88cdb00e2a88dd2" gracePeriod=15 Dec 03 20:15:52.404626 master-0 kubenswrapper[29252]: I1203 20:15:52.404130 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="kube-apiserver-check-endpoints" containerID="cri-o://119d4b6b3038dbead7fe6b9e33314320ced1c64886e5d19a0fc0f37d99ce9137" gracePeriod=15 Dec 03 20:15:52.404626 master-0 kubenswrapper[29252]: I1203 20:15:52.404188 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://ef71ce2641c94253ec625f7906b9b76cac05532370b5b06e58580131a3c97717" gracePeriod=15 Dec 03 20:15:52.404626 master-0 kubenswrapper[29252]: I1203 20:15:52.404225 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="382c2026eb84cf3d7672e1fe1646be64" 
containerName="kube-apiserver-cert-syncer" containerID="cri-o://545a2aa47bbcc1149d070c967862238b01682d831ff51d5729ad98e0bbed88a9" gracePeriod=15 Dec 03 20:15:52.404626 master-0 kubenswrapper[29252]: I1203 20:15:52.404213 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e308a447fef7af8ae0711fb67451ec5d981dbb028a918b0a38f448e0927e409f" gracePeriod=15 Dec 03 20:15:52.409606 master-0 kubenswrapper[29252]: I1203 20:15:52.409373 29252 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 03 20:15:52.409901 master-0 kubenswrapper[29252]: E1203 20:15:52.409861 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="kube-apiserver-insecure-readyz" Dec 03 20:15:52.409901 master-0 kubenswrapper[29252]: I1203 20:15:52.409894 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="kube-apiserver-insecure-readyz" Dec 03 20:15:52.410146 master-0 kubenswrapper[29252]: E1203 20:15:52.409930 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="kube-apiserver-check-endpoints" Dec 03 20:15:52.410146 master-0 kubenswrapper[29252]: I1203 20:15:52.409949 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="kube-apiserver-check-endpoints" Dec 03 20:15:52.410146 master-0 kubenswrapper[29252]: E1203 20:15:52.409982 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 20:15:52.410146 master-0 kubenswrapper[29252]: I1203 20:15:52.410000 29252 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 20:15:52.410146 master-0 kubenswrapper[29252]: E1203 20:15:52.410026 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="setup" Dec 03 20:15:52.410146 master-0 kubenswrapper[29252]: I1203 20:15:52.410038 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="setup" Dec 03 20:15:52.410146 master-0 kubenswrapper[29252]: E1203 20:15:52.410062 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="kube-apiserver" Dec 03 20:15:52.410146 master-0 kubenswrapper[29252]: I1203 20:15:52.410074 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="kube-apiserver" Dec 03 20:15:52.410146 master-0 kubenswrapper[29252]: E1203 20:15:52.410120 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="kube-apiserver-cert-syncer" Dec 03 20:15:52.410146 master-0 kubenswrapper[29252]: I1203 20:15:52.410134 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="kube-apiserver-cert-syncer" Dec 03 20:15:52.412336 master-0 kubenswrapper[29252]: I1203 20:15:52.410359 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="kube-apiserver-check-endpoints" Dec 03 20:15:52.412336 master-0 kubenswrapper[29252]: I1203 20:15:52.410386 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="kube-apiserver-cert-syncer" Dec 03 20:15:52.412336 master-0 kubenswrapper[29252]: I1203 20:15:52.410405 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="382c2026eb84cf3d7672e1fe1646be64" 
containerName="kube-apiserver-insecure-readyz" Dec 03 20:15:52.412336 master-0 kubenswrapper[29252]: I1203 20:15:52.410760 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="kube-apiserver" Dec 03 20:15:52.412336 master-0 kubenswrapper[29252]: I1203 20:15:52.410840 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="382c2026eb84cf3d7672e1fe1646be64" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 20:15:52.415444 master-0 kubenswrapper[29252]: I1203 20:15:52.415364 29252 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 03 20:15:52.417627 master-0 kubenswrapper[29252]: I1203 20:15:52.417537 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:52.428593 master-0 kubenswrapper[29252]: I1203 20:15:52.428229 29252 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="382c2026eb84cf3d7672e1fe1646be64" podUID="f5aa2d6b41f5e21a89224256dc48af14" Dec 03 20:15:52.505720 master-0 kubenswrapper[29252]: E1203 20:15:52.505605 29252 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:52.525946 master-0 kubenswrapper[29252]: E1203 20:15:52.525888 29252 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:52.526406 master-0 kubenswrapper[29252]: E1203 20:15:52.526366 
29252 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:52.526897 master-0 kubenswrapper[29252]: E1203 20:15:52.526860 29252 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:52.527298 master-0 kubenswrapper[29252]: E1203 20:15:52.527262 29252 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:52.527716 master-0 kubenswrapper[29252]: E1203 20:15:52.527682 29252 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:52.527754 master-0 kubenswrapper[29252]: I1203 20:15:52.527715 29252 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 03 20:15:52.528373 master-0 kubenswrapper[29252]: E1203 20:15:52.528302 29252 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Dec 03 20:15:52.560840 master-0 kubenswrapper[29252]: I1203 20:15:52.560752 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f5aa2d6b41f5e21a89224256dc48af14-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"f5aa2d6b41f5e21a89224256dc48af14\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:15:52.561130 master-0 kubenswrapper[29252]: I1203 20:15:52.561060 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f5aa2d6b41f5e21a89224256dc48af14-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"f5aa2d6b41f5e21a89224256dc48af14\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:15:52.561509 master-0 kubenswrapper[29252]: I1203 20:15:52.561340 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:52.561509 master-0 kubenswrapper[29252]: I1203 20:15:52.561379 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:52.561509 master-0 kubenswrapper[29252]: I1203 20:15:52.561427 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:52.561509 master-0 kubenswrapper[29252]: I1203 
20:15:52.561455 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:52.561509 master-0 kubenswrapper[29252]: I1203 20:15:52.561485 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:52.562241 master-0 kubenswrapper[29252]: I1203 20:15:52.561534 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f5aa2d6b41f5e21a89224256dc48af14-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"f5aa2d6b41f5e21a89224256dc48af14\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:15:52.663430 master-0 kubenswrapper[29252]: I1203 20:15:52.663273 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f5aa2d6b41f5e21a89224256dc48af14-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"f5aa2d6b41f5e21a89224256dc48af14\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:15:52.664081 master-0 kubenswrapper[29252]: I1203 20:15:52.663450 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f5aa2d6b41f5e21a89224256dc48af14-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"f5aa2d6b41f5e21a89224256dc48af14\") " 
pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:15:52.664081 master-0 kubenswrapper[29252]: I1203 20:15:52.663456 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:52.664081 master-0 kubenswrapper[29252]: I1203 20:15:52.663557 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:52.664081 master-0 kubenswrapper[29252]: I1203 20:15:52.663618 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:52.664081 master-0 kubenswrapper[29252]: I1203 20:15:52.663685 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:52.664081 master-0 kubenswrapper[29252]: I1203 20:15:52.663734 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-master-0\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:52.664081 master-0 kubenswrapper[29252]: I1203 20:15:52.663828 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:52.664081 master-0 kubenswrapper[29252]: I1203 20:15:52.663843 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:52.664081 master-0 kubenswrapper[29252]: I1203 20:15:52.663884 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:52.664081 master-0 kubenswrapper[29252]: I1203 20:15:52.663925 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:52.664081 master-0 kubenswrapper[29252]: I1203 20:15:52.663955 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:52.664081 master-0 kubenswrapper[29252]: I1203 20:15:52.664024 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f5aa2d6b41f5e21a89224256dc48af14-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"f5aa2d6b41f5e21a89224256dc48af14\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:15:52.664081 master-0 kubenswrapper[29252]: I1203 20:15:52.664095 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f5aa2d6b41f5e21a89224256dc48af14-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"f5aa2d6b41f5e21a89224256dc48af14\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:15:52.665216 master-0 kubenswrapper[29252]: I1203 20:15:52.664193 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f5aa2d6b41f5e21a89224256dc48af14-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"f5aa2d6b41f5e21a89224256dc48af14\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:15:52.665216 master-0 kubenswrapper[29252]: I1203 20:15:52.664273 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f5aa2d6b41f5e21a89224256dc48af14-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"f5aa2d6b41f5e21a89224256dc48af14\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:15:52.730642 master-0 kubenswrapper[29252]: E1203 20:15:52.730519 29252 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Dec 03 20:15:52.807289 master-0 kubenswrapper[29252]: I1203 20:15:52.807217 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:52.832195 master-0 kubenswrapper[29252]: E1203 20:15:52.832015 29252 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.187dcdd71f9c7fd3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:c98a8d85d3901d33f6fe192bdc7172aa,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 20:15:52.831115219 +0000 UTC m=+387.644660172,LastTimestamp:2025-12-03 20:15:52.831115219 +0000 UTC m=+387.644660172,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 20:15:53.132592 master-0 kubenswrapper[29252]: E1203 20:15:53.132544 29252 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection 
refused" interval="800ms" Dec 03 20:15:53.433746 master-0 kubenswrapper[29252]: I1203 20:15:53.433686 29252 generic.go:334] "Generic (PLEG): container finished" podID="108176a9-101d-4204-8ed3-4ed41ccdaae0" containerID="bb9066638a39b91f2bc07e934233abcb8b6f75e527303c906c13e755c9e51aee" exitCode=0 Dec 03 20:15:53.434157 master-0 kubenswrapper[29252]: I1203 20:15:53.433927 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"108176a9-101d-4204-8ed3-4ed41ccdaae0","Type":"ContainerDied","Data":"bb9066638a39b91f2bc07e934233abcb8b6f75e527303c906c13e755c9e51aee"} Dec 03 20:15:53.435637 master-0 kubenswrapper[29252]: I1203 20:15:53.435565 29252 status_manager.go:851] "Failed to get status for pod" podUID="108176a9-101d-4204-8ed3-4ed41ccdaae0" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:53.436552 master-0 kubenswrapper[29252]: I1203 20:15:53.436523 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"c98a8d85d3901d33f6fe192bdc7172aa","Type":"ContainerStarted","Data":"0c90690bf74079ace519f114b7de842fc09b8f38d743012d018a57a54c703915"} Dec 03 20:15:53.436654 master-0 kubenswrapper[29252]: I1203 20:15:53.436640 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"c98a8d85d3901d33f6fe192bdc7172aa","Type":"ContainerStarted","Data":"a1c8e9d518c6e2341173796e5864ccfab2ef4750131dfbdc7c1e714993a20ea6"} Dec 03 20:15:53.437996 master-0 kubenswrapper[29252]: E1203 20:15:53.437934 29252 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 
192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:53.438064 master-0 kubenswrapper[29252]: I1203 20:15:53.437960 29252 status_manager.go:851] "Failed to get status for pod" podUID="108176a9-101d-4204-8ed3-4ed41ccdaae0" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:53.440738 master-0 kubenswrapper[29252]: I1203 20:15:53.440716 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_382c2026eb84cf3d7672e1fe1646be64/kube-apiserver-cert-syncer/0.log" Dec 03 20:15:53.442094 master-0 kubenswrapper[29252]: I1203 20:15:53.442072 29252 generic.go:334] "Generic (PLEG): container finished" podID="382c2026eb84cf3d7672e1fe1646be64" containerID="119d4b6b3038dbead7fe6b9e33314320ced1c64886e5d19a0fc0f37d99ce9137" exitCode=0 Dec 03 20:15:53.442203 master-0 kubenswrapper[29252]: I1203 20:15:53.442179 29252 generic.go:334] "Generic (PLEG): container finished" podID="382c2026eb84cf3d7672e1fe1646be64" containerID="e308a447fef7af8ae0711fb67451ec5d981dbb028a918b0a38f448e0927e409f" exitCode=0 Dec 03 20:15:53.442281 master-0 kubenswrapper[29252]: I1203 20:15:53.442269 29252 generic.go:334] "Generic (PLEG): container finished" podID="382c2026eb84cf3d7672e1fe1646be64" containerID="ef71ce2641c94253ec625f7906b9b76cac05532370b5b06e58580131a3c97717" exitCode=0 Dec 03 20:15:53.442344 master-0 kubenswrapper[29252]: I1203 20:15:53.442332 29252 generic.go:334] "Generic (PLEG): container finished" podID="382c2026eb84cf3d7672e1fe1646be64" containerID="545a2aa47bbcc1149d070c967862238b01682d831ff51d5729ad98e0bbed88a9" exitCode=2 Dec 03 20:15:53.934387 master-0 kubenswrapper[29252]: E1203 20:15:53.934281 29252 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Dec 03 20:15:54.454949 master-0 kubenswrapper[29252]: E1203 20:15:54.454830 29252 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:15:54.945482 master-0 kubenswrapper[29252]: I1203 20:15:54.945415 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_382c2026eb84cf3d7672e1fe1646be64/kube-apiserver-cert-syncer/0.log" Dec 03 20:15:54.946945 master-0 kubenswrapper[29252]: I1203 20:15:54.946899 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:15:54.948256 master-0 kubenswrapper[29252]: I1203 20:15:54.948168 29252 status_manager.go:851] "Failed to get status for pod" podUID="382c2026eb84cf3d7672e1fe1646be64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:54.948256 master-0 kubenswrapper[29252]: I1203 20:15:54.948310 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 20:15:54.949312 master-0 kubenswrapper[29252]: I1203 20:15:54.949234 29252 status_manager.go:851] "Failed to get status for pod" podUID="108176a9-101d-4204-8ed3-4ed41ccdaae0" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:54.950351 master-0 kubenswrapper[29252]: I1203 20:15:54.950277 29252 status_manager.go:851] "Failed to get status for pod" podUID="382c2026eb84cf3d7672e1fe1646be64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:54.951148 master-0 kubenswrapper[29252]: I1203 20:15:54.951081 29252 status_manager.go:851] "Failed to get status for pod" podUID="108176a9-101d-4204-8ed3-4ed41ccdaae0" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:55.038621 master-0 kubenswrapper[29252]: I1203 20:15:55.038516 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/108176a9-101d-4204-8ed3-4ed41ccdaae0-kubelet-dir\") pod \"108176a9-101d-4204-8ed3-4ed41ccdaae0\" (UID: \"108176a9-101d-4204-8ed3-4ed41ccdaae0\") " Dec 03 20:15:55.038621 master-0 kubenswrapper[29252]: I1203 20:15:55.038623 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/108176a9-101d-4204-8ed3-4ed41ccdaae0-var-lock\") pod \"108176a9-101d-4204-8ed3-4ed41ccdaae0\" (UID: 
\"108176a9-101d-4204-8ed3-4ed41ccdaae0\") " Dec 03 20:15:55.039029 master-0 kubenswrapper[29252]: I1203 20:15:55.038628 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/108176a9-101d-4204-8ed3-4ed41ccdaae0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "108176a9-101d-4204-8ed3-4ed41ccdaae0" (UID: "108176a9-101d-4204-8ed3-4ed41ccdaae0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:15:55.039029 master-0 kubenswrapper[29252]: I1203 20:15:55.038677 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/108176a9-101d-4204-8ed3-4ed41ccdaae0-var-lock" (OuterVolumeSpecName: "var-lock") pod "108176a9-101d-4204-8ed3-4ed41ccdaae0" (UID: "108176a9-101d-4204-8ed3-4ed41ccdaae0"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:15:55.039029 master-0 kubenswrapper[29252]: I1203 20:15:55.038710 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/382c2026eb84cf3d7672e1fe1646be64-audit-dir\") pod \"382c2026eb84cf3d7672e1fe1646be64\" (UID: \"382c2026eb84cf3d7672e1fe1646be64\") " Dec 03 20:15:55.039029 master-0 kubenswrapper[29252]: I1203 20:15:55.038743 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/382c2026eb84cf3d7672e1fe1646be64-cert-dir\") pod \"382c2026eb84cf3d7672e1fe1646be64\" (UID: \"382c2026eb84cf3d7672e1fe1646be64\") " Dec 03 20:15:55.039029 master-0 kubenswrapper[29252]: I1203 20:15:55.038831 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/382c2026eb84cf3d7672e1fe1646be64-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "382c2026eb84cf3d7672e1fe1646be64" (UID: "382c2026eb84cf3d7672e1fe1646be64"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:15:55.039029 master-0 kubenswrapper[29252]: I1203 20:15:55.038849 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/108176a9-101d-4204-8ed3-4ed41ccdaae0-kube-api-access\") pod \"108176a9-101d-4204-8ed3-4ed41ccdaae0\" (UID: \"108176a9-101d-4204-8ed3-4ed41ccdaae0\") " Dec 03 20:15:55.039029 master-0 kubenswrapper[29252]: I1203 20:15:55.038976 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/382c2026eb84cf3d7672e1fe1646be64-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "382c2026eb84cf3d7672e1fe1646be64" (UID: "382c2026eb84cf3d7672e1fe1646be64"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:15:55.039029 master-0 kubenswrapper[29252]: I1203 20:15:55.039021 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/382c2026eb84cf3d7672e1fe1646be64-resource-dir\") pod \"382c2026eb84cf3d7672e1fe1646be64\" (UID: \"382c2026eb84cf3d7672e1fe1646be64\") " Dec 03 20:15:55.039696 master-0 kubenswrapper[29252]: I1203 20:15:55.039114 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/382c2026eb84cf3d7672e1fe1646be64-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "382c2026eb84cf3d7672e1fe1646be64" (UID: "382c2026eb84cf3d7672e1fe1646be64"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:15:55.039696 master-0 kubenswrapper[29252]: I1203 20:15:55.039514 29252 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/382c2026eb84cf3d7672e1fe1646be64-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:55.039696 master-0 kubenswrapper[29252]: I1203 20:15:55.039539 29252 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/108176a9-101d-4204-8ed3-4ed41ccdaae0-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:55.039696 master-0 kubenswrapper[29252]: I1203 20:15:55.039557 29252 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/108176a9-101d-4204-8ed3-4ed41ccdaae0-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:55.039696 master-0 kubenswrapper[29252]: I1203 20:15:55.039573 29252 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/382c2026eb84cf3d7672e1fe1646be64-audit-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:55.039696 master-0 kubenswrapper[29252]: I1203 20:15:55.039589 29252 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/382c2026eb84cf3d7672e1fe1646be64-cert-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:55.043401 master-0 kubenswrapper[29252]: I1203 20:15:55.043351 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/108176a9-101d-4204-8ed3-4ed41ccdaae0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "108176a9-101d-4204-8ed3-4ed41ccdaae0" (UID: "108176a9-101d-4204-8ed3-4ed41ccdaae0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:15:55.141761 master-0 kubenswrapper[29252]: I1203 20:15:55.141663 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/108176a9-101d-4204-8ed3-4ed41ccdaae0-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 20:15:55.424072 master-0 kubenswrapper[29252]: I1203 20:15:55.423958 29252 status_manager.go:851] "Failed to get status for pod" podUID="382c2026eb84cf3d7672e1fe1646be64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:55.424484 master-0 kubenswrapper[29252]: I1203 20:15:55.424464 29252 status_manager.go:851] "Failed to get status for pod" podUID="108176a9-101d-4204-8ed3-4ed41ccdaae0" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:55.427285 master-0 kubenswrapper[29252]: I1203 20:15:55.427243 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="382c2026eb84cf3d7672e1fe1646be64" path="/var/lib/kubelet/pods/382c2026eb84cf3d7672e1fe1646be64/volumes" Dec 03 20:15:55.463477 master-0 kubenswrapper[29252]: I1203 20:15:55.463237 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_382c2026eb84cf3d7672e1fe1646be64/kube-apiserver-cert-syncer/0.log" Dec 03 20:15:55.464441 master-0 kubenswrapper[29252]: I1203 20:15:55.464384 29252 generic.go:334] "Generic (PLEG): container finished" podID="382c2026eb84cf3d7672e1fe1646be64" containerID="964cf291812143ff2c57dd42685f00b48fdffd3b9344956cd88cdb00e2a88dd2" exitCode=0 Dec 03 20:15:55.464564 master-0 
kubenswrapper[29252]: I1203 20:15:55.464459 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:15:55.464564 master-0 kubenswrapper[29252]: I1203 20:15:55.464527 29252 scope.go:117] "RemoveContainer" containerID="119d4b6b3038dbead7fe6b9e33314320ced1c64886e5d19a0fc0f37d99ce9137" Dec 03 20:15:55.466934 master-0 kubenswrapper[29252]: I1203 20:15:55.465865 29252 status_manager.go:851] "Failed to get status for pod" podUID="382c2026eb84cf3d7672e1fe1646be64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:55.466934 master-0 kubenswrapper[29252]: I1203 20:15:55.466803 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"108176a9-101d-4204-8ed3-4ed41ccdaae0","Type":"ContainerDied","Data":"16fdc6b40014b79b88a4feddf0d138a0e52af227df44c76bb17b06c77df24956"} Dec 03 20:15:55.466934 master-0 kubenswrapper[29252]: I1203 20:15:55.466834 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16fdc6b40014b79b88a4feddf0d138a0e52af227df44c76bb17b06c77df24956" Dec 03 20:15:55.466934 master-0 kubenswrapper[29252]: I1203 20:15:55.466827 29252 status_manager.go:851] "Failed to get status for pod" podUID="108176a9-101d-4204-8ed3-4ed41ccdaae0" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:55.466934 master-0 kubenswrapper[29252]: I1203 20:15:55.466879 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 20:15:55.473118 master-0 kubenswrapper[29252]: I1203 20:15:55.473032 29252 status_manager.go:851] "Failed to get status for pod" podUID="382c2026eb84cf3d7672e1fe1646be64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:55.490537 master-0 kubenswrapper[29252]: I1203 20:15:55.490476 29252 status_manager.go:851] "Failed to get status for pod" podUID="108176a9-101d-4204-8ed3-4ed41ccdaae0" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:55.492487 master-0 kubenswrapper[29252]: I1203 20:15:55.492386 29252 status_manager.go:851] "Failed to get status for pod" podUID="382c2026eb84cf3d7672e1fe1646be64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:55.492936 master-0 kubenswrapper[29252]: I1203 20:15:55.492884 29252 scope.go:117] "RemoveContainer" containerID="e308a447fef7af8ae0711fb67451ec5d981dbb028a918b0a38f448e0927e409f" Dec 03 20:15:55.493332 master-0 kubenswrapper[29252]: I1203 20:15:55.493281 29252 status_manager.go:851] "Failed to get status for pod" podUID="108176a9-101d-4204-8ed3-4ed41ccdaae0" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 20:15:55.512412 master-0 kubenswrapper[29252]: I1203 
20:15:55.512360 29252 scope.go:117] "RemoveContainer" containerID="ef71ce2641c94253ec625f7906b9b76cac05532370b5b06e58580131a3c97717" Dec 03 20:15:55.531088 master-0 kubenswrapper[29252]: I1203 20:15:55.531050 29252 scope.go:117] "RemoveContainer" containerID="545a2aa47bbcc1149d070c967862238b01682d831ff51d5729ad98e0bbed88a9" Dec 03 20:15:55.536237 master-0 kubenswrapper[29252]: E1203 20:15:55.536185 29252 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Dec 03 20:15:55.549304 master-0 kubenswrapper[29252]: I1203 20:15:55.549267 29252 scope.go:117] "RemoveContainer" containerID="964cf291812143ff2c57dd42685f00b48fdffd3b9344956cd88cdb00e2a88dd2" Dec 03 20:15:55.567097 master-0 kubenswrapper[29252]: I1203 20:15:55.567057 29252 scope.go:117] "RemoveContainer" containerID="875c7edfcaad228f07b8db53cf8ddb0aab324d9c88bddb362bf3f034691caa0b" Dec 03 20:15:55.598678 master-0 kubenswrapper[29252]: I1203 20:15:55.598650 29252 scope.go:117] "RemoveContainer" containerID="119d4b6b3038dbead7fe6b9e33314320ced1c64886e5d19a0fc0f37d99ce9137" Dec 03 20:15:55.599293 master-0 kubenswrapper[29252]: E1203 20:15:55.599242 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"119d4b6b3038dbead7fe6b9e33314320ced1c64886e5d19a0fc0f37d99ce9137\": container with ID starting with 119d4b6b3038dbead7fe6b9e33314320ced1c64886e5d19a0fc0f37d99ce9137 not found: ID does not exist" containerID="119d4b6b3038dbead7fe6b9e33314320ced1c64886e5d19a0fc0f37d99ce9137" Dec 03 20:15:55.599419 master-0 kubenswrapper[29252]: I1203 20:15:55.599376 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119d4b6b3038dbead7fe6b9e33314320ced1c64886e5d19a0fc0f37d99ce9137"} err="failed to 
get container status \"119d4b6b3038dbead7fe6b9e33314320ced1c64886e5d19a0fc0f37d99ce9137\": rpc error: code = NotFound desc = could not find container \"119d4b6b3038dbead7fe6b9e33314320ced1c64886e5d19a0fc0f37d99ce9137\": container with ID starting with 119d4b6b3038dbead7fe6b9e33314320ced1c64886e5d19a0fc0f37d99ce9137 not found: ID does not exist" Dec 03 20:15:55.599491 master-0 kubenswrapper[29252]: I1203 20:15:55.599480 29252 scope.go:117] "RemoveContainer" containerID="e308a447fef7af8ae0711fb67451ec5d981dbb028a918b0a38f448e0927e409f" Dec 03 20:15:55.599954 master-0 kubenswrapper[29252]: E1203 20:15:55.599930 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e308a447fef7af8ae0711fb67451ec5d981dbb028a918b0a38f448e0927e409f\": container with ID starting with e308a447fef7af8ae0711fb67451ec5d981dbb028a918b0a38f448e0927e409f not found: ID does not exist" containerID="e308a447fef7af8ae0711fb67451ec5d981dbb028a918b0a38f448e0927e409f" Dec 03 20:15:55.600057 master-0 kubenswrapper[29252]: I1203 20:15:55.599954 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e308a447fef7af8ae0711fb67451ec5d981dbb028a918b0a38f448e0927e409f"} err="failed to get container status \"e308a447fef7af8ae0711fb67451ec5d981dbb028a918b0a38f448e0927e409f\": rpc error: code = NotFound desc = could not find container \"e308a447fef7af8ae0711fb67451ec5d981dbb028a918b0a38f448e0927e409f\": container with ID starting with e308a447fef7af8ae0711fb67451ec5d981dbb028a918b0a38f448e0927e409f not found: ID does not exist" Dec 03 20:15:55.600057 master-0 kubenswrapper[29252]: I1203 20:15:55.599970 29252 scope.go:117] "RemoveContainer" containerID="ef71ce2641c94253ec625f7906b9b76cac05532370b5b06e58580131a3c97717" Dec 03 20:15:55.600915 master-0 kubenswrapper[29252]: E1203 20:15:55.600850 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"ef71ce2641c94253ec625f7906b9b76cac05532370b5b06e58580131a3c97717\": container with ID starting with ef71ce2641c94253ec625f7906b9b76cac05532370b5b06e58580131a3c97717 not found: ID does not exist" containerID="ef71ce2641c94253ec625f7906b9b76cac05532370b5b06e58580131a3c97717" Dec 03 20:15:55.600915 master-0 kubenswrapper[29252]: I1203 20:15:55.600877 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef71ce2641c94253ec625f7906b9b76cac05532370b5b06e58580131a3c97717"} err="failed to get container status \"ef71ce2641c94253ec625f7906b9b76cac05532370b5b06e58580131a3c97717\": rpc error: code = NotFound desc = could not find container \"ef71ce2641c94253ec625f7906b9b76cac05532370b5b06e58580131a3c97717\": container with ID starting with ef71ce2641c94253ec625f7906b9b76cac05532370b5b06e58580131a3c97717 not found: ID does not exist" Dec 03 20:15:55.600915 master-0 kubenswrapper[29252]: I1203 20:15:55.600895 29252 scope.go:117] "RemoveContainer" containerID="545a2aa47bbcc1149d070c967862238b01682d831ff51d5729ad98e0bbed88a9" Dec 03 20:15:55.602677 master-0 kubenswrapper[29252]: E1203 20:15:55.602633 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"545a2aa47bbcc1149d070c967862238b01682d831ff51d5729ad98e0bbed88a9\": container with ID starting with 545a2aa47bbcc1149d070c967862238b01682d831ff51d5729ad98e0bbed88a9 not found: ID does not exist" containerID="545a2aa47bbcc1149d070c967862238b01682d831ff51d5729ad98e0bbed88a9" Dec 03 20:15:55.602817 master-0 kubenswrapper[29252]: I1203 20:15:55.602688 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"545a2aa47bbcc1149d070c967862238b01682d831ff51d5729ad98e0bbed88a9"} err="failed to get container status \"545a2aa47bbcc1149d070c967862238b01682d831ff51d5729ad98e0bbed88a9\": rpc error: code = NotFound desc = could not find container 
\"545a2aa47bbcc1149d070c967862238b01682d831ff51d5729ad98e0bbed88a9\": container with ID starting with 545a2aa47bbcc1149d070c967862238b01682d831ff51d5729ad98e0bbed88a9 not found: ID does not exist" Dec 03 20:15:55.602817 master-0 kubenswrapper[29252]: I1203 20:15:55.602733 29252 scope.go:117] "RemoveContainer" containerID="964cf291812143ff2c57dd42685f00b48fdffd3b9344956cd88cdb00e2a88dd2" Dec 03 20:15:55.603208 master-0 kubenswrapper[29252]: E1203 20:15:55.603177 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"964cf291812143ff2c57dd42685f00b48fdffd3b9344956cd88cdb00e2a88dd2\": container with ID starting with 964cf291812143ff2c57dd42685f00b48fdffd3b9344956cd88cdb00e2a88dd2 not found: ID does not exist" containerID="964cf291812143ff2c57dd42685f00b48fdffd3b9344956cd88cdb00e2a88dd2" Dec 03 20:15:55.603282 master-0 kubenswrapper[29252]: I1203 20:15:55.603207 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"964cf291812143ff2c57dd42685f00b48fdffd3b9344956cd88cdb00e2a88dd2"} err="failed to get container status \"964cf291812143ff2c57dd42685f00b48fdffd3b9344956cd88cdb00e2a88dd2\": rpc error: code = NotFound desc = could not find container \"964cf291812143ff2c57dd42685f00b48fdffd3b9344956cd88cdb00e2a88dd2\": container with ID starting with 964cf291812143ff2c57dd42685f00b48fdffd3b9344956cd88cdb00e2a88dd2 not found: ID does not exist" Dec 03 20:15:55.603282 master-0 kubenswrapper[29252]: I1203 20:15:55.603225 29252 scope.go:117] "RemoveContainer" containerID="875c7edfcaad228f07b8db53cf8ddb0aab324d9c88bddb362bf3f034691caa0b" Dec 03 20:15:55.604722 master-0 kubenswrapper[29252]: E1203 20:15:55.604627 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"875c7edfcaad228f07b8db53cf8ddb0aab324d9c88bddb362bf3f034691caa0b\": container with ID starting with 
875c7edfcaad228f07b8db53cf8ddb0aab324d9c88bddb362bf3f034691caa0b not found: ID does not exist" containerID="875c7edfcaad228f07b8db53cf8ddb0aab324d9c88bddb362bf3f034691caa0b" Dec 03 20:15:55.604722 master-0 kubenswrapper[29252]: I1203 20:15:55.604655 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"875c7edfcaad228f07b8db53cf8ddb0aab324d9c88bddb362bf3f034691caa0b"} err="failed to get container status \"875c7edfcaad228f07b8db53cf8ddb0aab324d9c88bddb362bf3f034691caa0b\": rpc error: code = NotFound desc = could not find container \"875c7edfcaad228f07b8db53cf8ddb0aab324d9c88bddb362bf3f034691caa0b\": container with ID starting with 875c7edfcaad228f07b8db53cf8ddb0aab324d9c88bddb362bf3f034691caa0b not found: ID does not exist" Dec 03 20:15:56.157620 master-0 kubenswrapper[29252]: E1203 20:15:56.157502 29252 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.187dcdd71f9c7fd3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:c98a8d85d3901d33f6fe192bdc7172aa,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 20:15:52.831115219 +0000 UTC m=+387.644660172,LastTimestamp:2025-12-03 20:15:52.831115219 +0000 UTC m=+387.644660172,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 20:15:57.657249 master-0 kubenswrapper[29252]: I1203 20:15:57.657153 29252 patch_prober.go:28] interesting pod/console-65c74dc56f-mlqjw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Dec 03 20:15:57.658211 master-0 kubenswrapper[29252]: I1203 20:15:57.657252 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-65c74dc56f-mlqjw" podUID="e13ed7cc-6322-4676-88fc-363cff00f509" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Dec 03 20:15:58.737322 master-0 kubenswrapper[29252]: E1203 20:15:58.737135 29252 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Dec 03 20:16:01.824642 master-0 kubenswrapper[29252]: I1203 20:16:01.824550 29252 patch_prober.go:28] interesting pod/console-6465b775c-7mmtn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.108:8443/health\": dial tcp 10.128.0.108:8443: connect: connection refused" start-of-body= Dec 03 20:16:01.825512 master-0 kubenswrapper[29252]: I1203 20:16:01.824646 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6465b775c-7mmtn" podUID="3d2903de-a51a-415a-80be-9ba79b4e173d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.108:8443/health\": dial tcp 10.128.0.108:8443: connect: connection refused" Dec 03 20:16:02.454038 master-0 kubenswrapper[29252]: E1203 20:16:02.453936 29252 kubelet_node_status.go:585] 
"Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:16:02Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:16:02Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:16:02Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T20:16:02Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Dec 03 20:16:02.454663 master-0 kubenswrapper[29252]: E1203 20:16:02.454612 29252 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Dec 03 20:16:02.455242 master-0 kubenswrapper[29252]: E1203 20:16:02.455198 29252 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Dec 03 20:16:02.456046 master-0 kubenswrapper[29252]: E1203 20:16:02.455982 29252 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Dec 03 20:16:02.457195 master-0 kubenswrapper[29252]: E1203 20:16:02.457101 29252 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Dec 03 20:16:02.457195 master-0 kubenswrapper[29252]: E1203 20:16:02.457179 29252 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Dec 03 20:16:04.415924 master-0 kubenswrapper[29252]: I1203 20:16:04.415848 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Dec 03 20:16:04.418087 master-0 kubenswrapper[29252]: I1203 20:16:04.417979 29252 status_manager.go:851] "Failed to get status for pod" podUID="108176a9-101d-4204-8ed3-4ed41ccdaae0" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Dec 03 20:16:04.453916 master-0 kubenswrapper[29252]: I1203 20:16:04.453851 29252 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="06299bca-776c-487d-b578-c712c1a65372"
Dec 03 20:16:04.453916 master-0 kubenswrapper[29252]: I1203 20:16:04.453900 29252 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="06299bca-776c-487d-b578-c712c1a65372"
Dec 03 20:16:04.455104 master-0 kubenswrapper[29252]: E1203 20:16:04.455015 29252 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Dec 03 20:16:04.456122 master-0 kubenswrapper[29252]: I1203 20:16:04.456064 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Dec 03 20:16:04.489293 master-0 kubenswrapper[29252]: W1203 20:16:04.489228 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5aa2d6b41f5e21a89224256dc48af14.slice/crio-9fcc15b2842450da408f591867b09d4d6117b79f919873d4ba3cc5674a721737 WatchSource:0}: Error finding container 9fcc15b2842450da408f591867b09d4d6117b79f919873d4ba3cc5674a721737: Status 404 returned error can't find the container with id 9fcc15b2842450da408f591867b09d4d6117b79f919873d4ba3cc5674a721737
Dec 03 20:16:04.557071 master-0 kubenswrapper[29252]: I1203 20:16:04.556973 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"f5aa2d6b41f5e21a89224256dc48af14","Type":"ContainerStarted","Data":"9fcc15b2842450da408f591867b09d4d6117b79f919873d4ba3cc5674a721737"}
Dec 03 20:16:05.139337 master-0 kubenswrapper[29252]: E1203 20:16:05.139231 29252 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s"
Dec 03 20:16:05.425059 master-0 kubenswrapper[29252]: I1203 20:16:05.424844 29252 status_manager.go:851] "Failed to get status for pod" podUID="108176a9-101d-4204-8ed3-4ed41ccdaae0" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Dec 03 20:16:05.426023 master-0 kubenswrapper[29252]: I1203 20:16:05.425923 29252 status_manager.go:851] "Failed to get status for pod" podUID="f5aa2d6b41f5e21a89224256dc48af14" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Dec 03 20:16:05.567041 master-0 kubenswrapper[29252]: I1203 20:16:05.566946 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_6fb0810126310d28fb5532674012978b/kube-controller-manager/1.log"
Dec 03 20:16:05.569692 master-0 kubenswrapper[29252]: I1203 20:16:05.569531 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_6fb0810126310d28fb5532674012978b/kube-controller-manager/0.log"
Dec 03 20:16:05.569692 master-0 kubenswrapper[29252]: I1203 20:16:05.569613 29252 generic.go:334] "Generic (PLEG): container finished" podID="6fb0810126310d28fb5532674012978b" containerID="1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98" exitCode=1
Dec 03 20:16:05.570031 master-0 kubenswrapper[29252]: I1203 20:16:05.569706 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"6fb0810126310d28fb5532674012978b","Type":"ContainerDied","Data":"1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98"}
Dec 03 20:16:05.570031 master-0 kubenswrapper[29252]: I1203 20:16:05.569756 29252 scope.go:117] "RemoveContainer" containerID="d2a9fff66cc8aec805af934297108d64fdaa0ffb64bc75c967dcb4742c7e5f5f"
Dec 03 20:16:05.570829 master-0 kubenswrapper[29252]: I1203 20:16:05.570689 29252 scope.go:117] "RemoveContainer" containerID="1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98"
Dec 03 20:16:05.571486 master-0 kubenswrapper[29252]: E1203 20:16:05.571417 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(6fb0810126310d28fb5532674012978b)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="6fb0810126310d28fb5532674012978b"
Dec 03 20:16:05.571638 master-0 kubenswrapper[29252]: I1203 20:16:05.571489 29252 status_manager.go:851] "Failed to get status for pod" podUID="f5aa2d6b41f5e21a89224256dc48af14" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Dec 03 20:16:05.573065 master-0 kubenswrapper[29252]: I1203 20:16:05.572386 29252 status_manager.go:851] "Failed to get status for pod" podUID="6fb0810126310d28fb5532674012978b" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Dec 03 20:16:05.573065 master-0 kubenswrapper[29252]: I1203 20:16:05.572946 29252 generic.go:334] "Generic (PLEG): container finished" podID="f5aa2d6b41f5e21a89224256dc48af14" containerID="c9d09b067ce9fdaf7166f88dd85c461bd35276146cc1d01c13d10a97ae47fec0" exitCode=0
Dec 03 20:16:05.573065 master-0 kubenswrapper[29252]: I1203 20:16:05.573021 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"f5aa2d6b41f5e21a89224256dc48af14","Type":"ContainerDied","Data":"c9d09b067ce9fdaf7166f88dd85c461bd35276146cc1d01c13d10a97ae47fec0"}
Dec 03 20:16:05.573389 master-0 kubenswrapper[29252]: I1203 20:16:05.573308 29252 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="06299bca-776c-487d-b578-c712c1a65372"
Dec 03 20:16:05.573389 master-0 kubenswrapper[29252]: I1203 20:16:05.573340 29252 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="06299bca-776c-487d-b578-c712c1a65372"
Dec 03 20:16:05.574434 master-0 kubenswrapper[29252]: I1203 20:16:05.574352 29252 status_manager.go:851] "Failed to get status for pod" podUID="108176a9-101d-4204-8ed3-4ed41ccdaae0" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Dec 03 20:16:05.574434 master-0 kubenswrapper[29252]: E1203 20:16:05.574402 29252 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Dec 03 20:16:05.575876 master-0 kubenswrapper[29252]: I1203 20:16:05.575823 29252 status_manager.go:851] "Failed to get status for pod" podUID="f5aa2d6b41f5e21a89224256dc48af14" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Dec 03 20:16:05.576735 master-0 kubenswrapper[29252]: I1203 20:16:05.576635 29252 status_manager.go:851] "Failed to get status for pod" podUID="6fb0810126310d28fb5532674012978b" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Dec 03 20:16:05.577684 master-0 kubenswrapper[29252]: I1203 20:16:05.577571 29252 status_manager.go:851] "Failed to get status for pod" podUID="108176a9-101d-4204-8ed3-4ed41ccdaae0" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Dec 03 20:16:06.598496 master-0 kubenswrapper[29252]: I1203 20:16:06.598394 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"f5aa2d6b41f5e21a89224256dc48af14","Type":"ContainerStarted","Data":"f306c209ad3aeddff9a85ec22dbb8bee6cbfdf9b00d3e08161d95ec1751e31ae"}
Dec 03 20:16:06.599041 master-0 kubenswrapper[29252]: I1203 20:16:06.599021 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"f5aa2d6b41f5e21a89224256dc48af14","Type":"ContainerStarted","Data":"2519bd335404a7e6fec0da2d91ae6a89d0da58b956b1ebdf66d1024da861f3ee"}
Dec 03 20:16:06.599118 master-0 kubenswrapper[29252]: I1203 20:16:06.599105 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"f5aa2d6b41f5e21a89224256dc48af14","Type":"ContainerStarted","Data":"86f0e419a839363526a1b0bf4edc03f93948858be4ab3281568e14d6d082eab6"}
Dec 03 20:16:06.602601 master-0 kubenswrapper[29252]: I1203 20:16:06.602572 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_6fb0810126310d28fb5532674012978b/kube-controller-manager/1.log"
Dec 03 20:16:07.616826 master-0 kubenswrapper[29252]: I1203 20:16:07.616093 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"f5aa2d6b41f5e21a89224256dc48af14","Type":"ContainerStarted","Data":"f0a32c46b4a35835fc2aa3569c96f29715e6f6f399503f019896b023b057ed45"}
Dec 03 20:16:07.616826 master-0 kubenswrapper[29252]: I1203 20:16:07.616143 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"f5aa2d6b41f5e21a89224256dc48af14","Type":"ContainerStarted","Data":"a2fb645a262ca6827b9be72d362edb1471e9f893ff65758a252368787ee9aad2"}
Dec 03 20:16:07.616826 master-0 kubenswrapper[29252]: I1203 20:16:07.616442 29252 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="06299bca-776c-487d-b578-c712c1a65372"
Dec 03 20:16:07.616826 master-0 kubenswrapper[29252]: I1203 20:16:07.616458 29252 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="06299bca-776c-487d-b578-c712c1a65372"
Dec 03 20:16:07.617497 master-0 kubenswrapper[29252]: I1203 20:16:07.617406 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Dec 03 20:16:07.662842 master-0 kubenswrapper[29252]: I1203 20:16:07.661923 29252 patch_prober.go:28] interesting pod/console-65c74dc56f-mlqjw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body=
Dec 03 20:16:07.662842 master-0 kubenswrapper[29252]: I1203 20:16:07.662012 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-65c74dc56f-mlqjw" podUID="e13ed7cc-6322-4676-88fc-363cff00f509" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused"
Dec 03 20:16:09.456424 master-0 kubenswrapper[29252]: I1203 20:16:09.456348 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Dec 03 20:16:09.456424 master-0 kubenswrapper[29252]: I1203 20:16:09.456433 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Dec 03 20:16:09.464755 master-0 kubenswrapper[29252]: I1203 20:16:09.464709 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Dec 03 20:16:10.465963 master-0 kubenswrapper[29252]: I1203 20:16:10.465876 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Dec 03 20:16:10.465963 master-0 kubenswrapper[29252]: I1203 20:16:10.465942 29252 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Dec 03 20:16:10.465963 master-0 kubenswrapper[29252]: I1203 20:16:10.465968 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Dec 03 20:16:10.468074 master-0 kubenswrapper[29252]: I1203 20:16:10.466965 29252 scope.go:117] "RemoveContainer" containerID="1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98"
Dec 03 20:16:10.468473 master-0 kubenswrapper[29252]: E1203 20:16:10.468382 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(6fb0810126310d28fb5532674012978b)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="6fb0810126310d28fb5532674012978b"
Dec 03 20:16:11.823444 master-0 kubenswrapper[29252]: I1203 20:16:11.823344 29252 patch_prober.go:28] interesting pod/console-6465b775c-7mmtn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.108:8443/health\": dial tcp 10.128.0.108:8443: connect: connection refused" start-of-body=
Dec 03 20:16:11.823444 master-0 kubenswrapper[29252]: I1203 20:16:11.823427 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6465b775c-7mmtn" podUID="3d2903de-a51a-415a-80be-9ba79b4e173d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.108:8443/health\": dial tcp 10.128.0.108:8443: connect: connection refused"
Dec 03 20:16:12.282192 master-0 kubenswrapper[29252]: E1203 20:16:12.282136 29252 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podeee0d023_d1ab_4c75_9a92_3a0e42d05168.slice/crio-bdf1a5022d494663c447e5c37888568b0c85dc5583346ff17d69da033cbe6f52.scope\": RecentStats: unable to find data in memory cache]"
Dec 03 20:16:12.628531 master-0 kubenswrapper[29252]: I1203 20:16:12.628453 29252 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Dec 03 20:16:12.651309 master-0 kubenswrapper[29252]: I1203 20:16:12.651238 29252 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="f5aa2d6b41f5e21a89224256dc48af14" podUID="c60659ad-91c0-40d1-b49e-113c9b624e64"
Dec 03 20:16:12.674476 master-0 kubenswrapper[29252]: I1203 20:16:12.674431 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-4-master-0_eee0d023-d1ab-4c75-9a92-3a0e42d05168/installer/0.log"
Dec 03 20:16:12.674697 master-0 kubenswrapper[29252]: I1203 20:16:12.674483 29252 generic.go:334] "Generic (PLEG): container finished" podID="eee0d023-d1ab-4c75-9a92-3a0e42d05168" containerID="bdf1a5022d494663c447e5c37888568b0c85dc5583346ff17d69da033cbe6f52" exitCode=1
Dec 03 20:16:12.674697 master-0 kubenswrapper[29252]: I1203 20:16:12.674579 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"eee0d023-d1ab-4c75-9a92-3a0e42d05168","Type":"ContainerDied","Data":"bdf1a5022d494663c447e5c37888568b0c85dc5583346ff17d69da033cbe6f52"}
Dec 03 20:16:12.674874 master-0 kubenswrapper[29252]: I1203 20:16:12.674770 29252 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="06299bca-776c-487d-b578-c712c1a65372"
Dec 03 20:16:12.674874 master-0 kubenswrapper[29252]: I1203 20:16:12.674804 29252 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="06299bca-776c-487d-b578-c712c1a65372"
Dec 03 20:16:12.680219 master-0 kubenswrapper[29252]: I1203 20:16:12.680163 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Dec 03 20:16:12.712343 master-0 kubenswrapper[29252]: I1203 20:16:12.712272 29252 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="f5aa2d6b41f5e21a89224256dc48af14" podUID="c60659ad-91c0-40d1-b49e-113c9b624e64"
Dec 03 20:16:13.685417 master-0 kubenswrapper[29252]: I1203 20:16:13.685334 29252 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="06299bca-776c-487d-b578-c712c1a65372"
Dec 03 20:16:13.685417 master-0 kubenswrapper[29252]: I1203 20:16:13.685392 29252 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="06299bca-776c-487d-b578-c712c1a65372"
Dec 03 20:16:13.688893 master-0 kubenswrapper[29252]: I1203 20:16:13.688839 29252 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="f5aa2d6b41f5e21a89224256dc48af14" podUID="c60659ad-91c0-40d1-b49e-113c9b624e64"
Dec 03 20:16:14.165049 master-0 kubenswrapper[29252]: I1203 20:16:14.164983 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-4-master-0_eee0d023-d1ab-4c75-9a92-3a0e42d05168/installer/0.log"
Dec 03 20:16:14.165310 master-0 kubenswrapper[29252]: I1203 20:16:14.165063 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0"
Dec 03 20:16:14.266238 master-0 kubenswrapper[29252]: I1203 20:16:14.266157 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eee0d023-d1ab-4c75-9a92-3a0e42d05168-kube-api-access\") pod \"eee0d023-d1ab-4c75-9a92-3a0e42d05168\" (UID: \"eee0d023-d1ab-4c75-9a92-3a0e42d05168\") "
Dec 03 20:16:14.266609 master-0 kubenswrapper[29252]: I1203 20:16:14.266249 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eee0d023-d1ab-4c75-9a92-3a0e42d05168-kubelet-dir\") pod \"eee0d023-d1ab-4c75-9a92-3a0e42d05168\" (UID: \"eee0d023-d1ab-4c75-9a92-3a0e42d05168\") "
Dec 03 20:16:14.266609 master-0 kubenswrapper[29252]: I1203 20:16:14.266335 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eee0d023-d1ab-4c75-9a92-3a0e42d05168-var-lock\") pod \"eee0d023-d1ab-4c75-9a92-3a0e42d05168\" (UID: \"eee0d023-d1ab-4c75-9a92-3a0e42d05168\") "
Dec 03 20:16:14.266609 master-0 kubenswrapper[29252]: I1203 20:16:14.266433 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eee0d023-d1ab-4c75-9a92-3a0e42d05168-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "eee0d023-d1ab-4c75-9a92-3a0e42d05168" (UID: "eee0d023-d1ab-4c75-9a92-3a0e42d05168"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:16:14.266972 master-0 kubenswrapper[29252]: I1203 20:16:14.266939 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eee0d023-d1ab-4c75-9a92-3a0e42d05168-var-lock" (OuterVolumeSpecName: "var-lock") pod "eee0d023-d1ab-4c75-9a92-3a0e42d05168" (UID: "eee0d023-d1ab-4c75-9a92-3a0e42d05168"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 20:16:14.267137 master-0 kubenswrapper[29252]: I1203 20:16:14.267093 29252 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eee0d023-d1ab-4c75-9a92-3a0e42d05168-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Dec 03 20:16:14.269260 master-0 kubenswrapper[29252]: I1203 20:16:14.269078 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee0d023-d1ab-4c75-9a92-3a0e42d05168-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "eee0d023-d1ab-4c75-9a92-3a0e42d05168" (UID: "eee0d023-d1ab-4c75-9a92-3a0e42d05168"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 20:16:14.368360 master-0 kubenswrapper[29252]: I1203 20:16:14.368285 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eee0d023-d1ab-4c75-9a92-3a0e42d05168-kube-api-access\") on node \"master-0\" DevicePath \"\""
Dec 03 20:16:14.368360 master-0 kubenswrapper[29252]: I1203 20:16:14.368328 29252 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eee0d023-d1ab-4c75-9a92-3a0e42d05168-var-lock\") on node \"master-0\" DevicePath \"\""
Dec 03 20:16:14.697519 master-0 kubenswrapper[29252]: I1203 20:16:14.697356 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-4-master-0_eee0d023-d1ab-4c75-9a92-3a0e42d05168/installer/0.log"
Dec 03 20:16:14.697519 master-0 kubenswrapper[29252]: I1203 20:16:14.697437 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"eee0d023-d1ab-4c75-9a92-3a0e42d05168","Type":"ContainerDied","Data":"03053c1707148ce58497905e1f695c64a701c5b906f823e5263d19d69e641fd2"}
Dec 03 20:16:14.697519 master-0 kubenswrapper[29252]: I1203 20:16:14.697473 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03053c1707148ce58497905e1f695c64a701c5b906f823e5263d19d69e641fd2"
Dec 03 20:16:14.699073 master-0 kubenswrapper[29252]: I1203 20:16:14.697554 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0"
Dec 03 20:16:16.009619 master-0 kubenswrapper[29252]: I1203 20:16:16.009471 29252 patch_prober.go:28] interesting pod/machine-config-daemon-7t8bs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 20:16:16.011021 master-0 kubenswrapper[29252]: I1203 20:16:16.009584 29252 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7t8bs" podUID="9891cf64-59e8-4d8d-94fe-17cfa4b18c1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 20:16:17.657075 master-0 kubenswrapper[29252]: I1203 20:16:17.656906 29252 patch_prober.go:28] interesting pod/console-65c74dc56f-mlqjw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body=
Dec 03 20:16:17.658216 master-0 kubenswrapper[29252]: I1203 20:16:17.657083 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-65c74dc56f-mlqjw" podUID="e13ed7cc-6322-4676-88fc-363cff00f509" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused"
Dec 03 20:16:21.822607 master-0 kubenswrapper[29252]: I1203 20:16:21.822533 29252 patch_prober.go:28] interesting pod/console-6465b775c-7mmtn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.108:8443/health\": dial tcp 10.128.0.108:8443: connect: connection refused" start-of-body=
Dec 03 20:16:21.823178 master-0 kubenswrapper[29252]: I1203 20:16:21.822629 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6465b775c-7mmtn" podUID="3d2903de-a51a-415a-80be-9ba79b4e173d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.108:8443/health\": dial tcp 10.128.0.108:8443: connect: connection refused"
Dec 03 20:16:22.416723 master-0 kubenswrapper[29252]: I1203 20:16:22.416633 29252 scope.go:117] "RemoveContainer" containerID="1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98"
Dec 03 20:16:22.781330 master-0 kubenswrapper[29252]: I1203 20:16:22.781266 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_6fb0810126310d28fb5532674012978b/kube-controller-manager/1.log"
Dec 03 20:16:23.683577 master-0 kubenswrapper[29252]: I1203 20:16:23.683494 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 03 20:16:23.730012 master-0 kubenswrapper[29252]: I1203 20:16:23.729929 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 03 20:16:23.791450 master-0 kubenswrapper[29252]: I1203 20:16:23.791351 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_6fb0810126310d28fb5532674012978b/kube-controller-manager/1.log"
Dec 03 20:16:23.792710 master-0 kubenswrapper[29252]: I1203 20:16:23.792651 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"6fb0810126310d28fb5532674012978b","Type":"ContainerStarted","Data":"e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046"}
Dec 03 20:16:24.090329 master-0 kubenswrapper[29252]: I1203 20:16:24.090165 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Dec 03 20:16:24.091081 master-0 kubenswrapper[29252]: I1203 20:16:24.091021 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Dec 03 20:16:24.237705 master-0 kubenswrapper[29252]: I1203 20:16:24.237632 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 03 20:16:24.251415 master-0 kubenswrapper[29252]: I1203 20:16:24.251356 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Dec 03 20:16:24.353514 master-0 kubenswrapper[29252]: I1203 20:16:24.353363 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 03 20:16:24.587435 master-0 kubenswrapper[29252]: I1203 20:16:24.587375 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Dec 03 20:16:24.673008 master-0 kubenswrapper[29252]: I1203 20:16:24.672919 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Dec 03 20:16:24.715456 master-0 kubenswrapper[29252]: I1203 20:16:24.715390 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 03 20:16:24.760185 master-0 kubenswrapper[29252]: I1203 20:16:24.760047 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 03 20:16:24.925357 master-0 kubenswrapper[29252]: I1203 20:16:24.925150 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 03 20:16:24.965642 master-0 kubenswrapper[29252]: I1203 20:16:24.965540 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-4vlhrftdrf07t"
Dec 03 20:16:24.991464 master-0 kubenswrapper[29252]: I1203 20:16:24.991365 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-482c4"
Dec 03 20:16:25.326264 master-0 kubenswrapper[29252]: I1203 20:16:25.326065 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 03 20:16:25.365042 master-0 kubenswrapper[29252]: I1203 20:16:25.364980 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 03 20:16:25.428365 master-0 kubenswrapper[29252]: I1203 20:16:25.428298 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Dec 03 20:16:25.430632 master-0 kubenswrapper[29252]: I1203 20:16:25.430590 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 03 20:16:25.676372 master-0 kubenswrapper[29252]: I1203 20:16:25.676284 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 03 20:16:25.873077 master-0 kubenswrapper[29252]: I1203 20:16:25.872989 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Dec 03 20:16:25.963052 master-0 kubenswrapper[29252]: I1203 20:16:25.962922 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 03 20:16:25.972610 master-0 kubenswrapper[29252]: I1203 20:16:25.972554 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Dec 03 20:16:25.999962 master-0 kubenswrapper[29252]: I1203 20:16:25.999878 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-brhmz"
Dec 03 20:16:26.026763 master-0 kubenswrapper[29252]: I1203 20:16:26.026669 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Dec 03 20:16:26.081367 master-0 kubenswrapper[29252]: I1203 20:16:26.081287 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Dec 03 20:16:26.099604 master-0 kubenswrapper[29252]: I1203 20:16:26.097708 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-6kl7k"
Dec 03 20:16:26.155159 master-0 kubenswrapper[29252]: I1203 20:16:26.155085 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client"
Dec 03 20:16:26.202344 master-0 kubenswrapper[29252]: I1203 20:16:26.202274 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Dec 03 20:16:26.284317 master-0 kubenswrapper[29252]: I1203 20:16:26.284204 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 03 20:16:26.308614 master-0 kubenswrapper[29252]: I1203 20:16:26.308555 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 03 20:16:26.347407 master-0 kubenswrapper[29252]: I1203 20:16:26.347360 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Dec 03 20:16:26.436685 master-0 kubenswrapper[29252]: I1203 20:16:26.436617 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 03 20:16:26.469014 master-0 kubenswrapper[29252]: I1203 20:16:26.468948 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Dec 03 20:16:26.513060 master-0 kubenswrapper[29252]: I1203 20:16:26.513011 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Dec 03 20:16:26.570122 master-0 kubenswrapper[29252]: I1203 20:16:26.569891 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle"
Dec 03 20:16:26.576144 master-0 kubenswrapper[29252]: I1203 20:16:26.576092 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 03 20:16:26.624153 master-0 kubenswrapper[29252]: I1203 20:16:26.622006 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Dec 03 20:16:26.639656 master-0 kubenswrapper[29252]: I1203 20:16:26.639601 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 03 20:16:26.647923 master-0 kubenswrapper[29252]: I1203 20:16:26.647864 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Dec 03 20:16:26.687806 master-0 kubenswrapper[29252]: I1203 20:16:26.686938 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 03 20:16:26.739165 master-0 kubenswrapper[29252]: I1203 20:16:26.739090 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 03 20:16:26.859425 master-0 kubenswrapper[29252]: I1203 20:16:26.859371 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 03 20:16:26.913234 master-0 kubenswrapper[29252]: I1203 20:16:26.913153 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Dec 03 20:16:26.995146 master-0 kubenswrapper[29252]: I1203 20:16:26.995052 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-7m7l6"
Dec 03 20:16:26.997941 master-0 kubenswrapper[29252]: I1203 20:16:26.997895 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 03 20:16:27.001565 master-0 kubenswrapper[29252]: I1203 20:16:27.001512 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 03 20:16:27.049888 master-0 kubenswrapper[29252]: I1203 20:16:27.049821 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs"
Dec 03 20:16:27.271465 master-0 kubenswrapper[29252]: I1203 20:16:27.271340 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Dec 03 20:16:27.317354 master-0 kubenswrapper[29252]: I1203 20:16:27.317276 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Dec 03 20:16:27.407289 master-0 kubenswrapper[29252]: I1203 20:16:27.407212 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Dec 03 20:16:27.496839 master-0 kubenswrapper[29252]: I1203 20:16:27.496731 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 03 20:16:27.612551 master-0 kubenswrapper[29252]: I1203 20:16:27.611340 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Dec 03 20:16:27.661394 master-0 kubenswrapper[29252]: I1203 20:16:27.661322 29252 kubelet.go:2542] "SyncLoop (probe)"
probe="startup" status="started" pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:16:27.666074 master-0 kubenswrapper[29252]: I1203 20:16:27.666014 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:16:27.792585 master-0 kubenswrapper[29252]: I1203 20:16:27.792518 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 20:16:27.887247 master-0 kubenswrapper[29252]: I1203 20:16:27.887111 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 20:16:27.915567 master-0 kubenswrapper[29252]: I1203 20:16:27.915508 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 20:16:27.917488 master-0 kubenswrapper[29252]: I1203 20:16:27.917454 29252 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 20:16:28.050636 master-0 kubenswrapper[29252]: I1203 20:16:28.050573 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 20:16:28.067403 master-0 kubenswrapper[29252]: I1203 20:16:28.067360 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Dec 03 20:16:28.096314 master-0 kubenswrapper[29252]: I1203 20:16:28.096261 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 20:16:28.187031 master-0 kubenswrapper[29252]: I1203 20:16:28.186890 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-7tjv7" Dec 03 20:16:28.213046 master-0 kubenswrapper[29252]: I1203 20:16:28.212983 29252 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"trusted-ca" Dec 03 20:16:28.220549 master-0 kubenswrapper[29252]: I1203 20:16:28.220503 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 20:16:28.243272 master-0 kubenswrapper[29252]: I1203 20:16:28.243223 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Dec 03 20:16:28.262969 master-0 kubenswrapper[29252]: I1203 20:16:28.262925 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 20:16:28.278852 master-0 kubenswrapper[29252]: I1203 20:16:28.278809 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Dec 03 20:16:28.345202 master-0 kubenswrapper[29252]: I1203 20:16:28.345128 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 20:16:28.376261 master-0 kubenswrapper[29252]: I1203 20:16:28.376139 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 20:16:28.490643 master-0 kubenswrapper[29252]: I1203 20:16:28.490504 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Dec 03 20:16:28.533517 master-0 kubenswrapper[29252]: I1203 20:16:28.533464 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Dec 03 20:16:28.563476 master-0 kubenswrapper[29252]: I1203 20:16:28.563399 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 20:16:28.578691 master-0 kubenswrapper[29252]: I1203 20:16:28.578633 29252 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Dec 03 20:16:28.700654 master-0 kubenswrapper[29252]: I1203 20:16:28.700597 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 20:16:28.762422 master-0 kubenswrapper[29252]: I1203 20:16:28.762311 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-2bcfr" Dec 03 20:16:28.814731 master-0 kubenswrapper[29252]: I1203 20:16:28.814649 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Dec 03 20:16:28.910002 master-0 kubenswrapper[29252]: I1203 20:16:28.909843 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Dec 03 20:16:29.029630 master-0 kubenswrapper[29252]: I1203 20:16:29.029501 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 20:16:29.033352 master-0 kubenswrapper[29252]: I1203 20:16:29.033324 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 20:16:29.043226 master-0 kubenswrapper[29252]: I1203 20:16:29.043150 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-ztlqb" Dec 03 20:16:29.090943 master-0 kubenswrapper[29252]: I1203 20:16:29.090880 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Dec 03 20:16:29.124128 master-0 kubenswrapper[29252]: I1203 20:16:29.124073 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Dec 03 20:16:29.200021 master-0 kubenswrapper[29252]: I1203 20:16:29.199970 29252 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 20:16:29.200486 master-0 kubenswrapper[29252]: I1203 20:16:29.200436 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 20:16:29.247418 master-0 kubenswrapper[29252]: I1203 20:16:29.247323 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 20:16:29.292472 master-0 kubenswrapper[29252]: I1203 20:16:29.292292 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 20:16:29.313317 master-0 kubenswrapper[29252]: I1203 20:16:29.313250 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Dec 03 20:16:29.320963 master-0 kubenswrapper[29252]: I1203 20:16:29.320859 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Dec 03 20:16:29.348589 master-0 kubenswrapper[29252]: I1203 20:16:29.348554 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 20:16:29.372593 master-0 kubenswrapper[29252]: I1203 20:16:29.372500 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 20:16:29.382233 master-0 kubenswrapper[29252]: I1203 20:16:29.382173 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Dec 03 20:16:29.441240 master-0 kubenswrapper[29252]: I1203 20:16:29.441152 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 20:16:29.489912 master-0 kubenswrapper[29252]: I1203 20:16:29.489820 29252 
reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 20:16:29.498675 master-0 kubenswrapper[29252]: I1203 20:16:29.498599 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 20:16:29.503019 master-0 kubenswrapper[29252]: I1203 20:16:29.502945 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 20:16:29.524323 master-0 kubenswrapper[29252]: I1203 20:16:29.524214 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 20:16:29.528863 master-0 kubenswrapper[29252]: I1203 20:16:29.528750 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 20:16:29.557614 master-0 kubenswrapper[29252]: I1203 20:16:29.557410 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 20:16:29.692315 master-0 kubenswrapper[29252]: I1203 20:16:29.692255 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 20:16:29.742471 master-0 kubenswrapper[29252]: I1203 20:16:29.742413 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 20:16:29.767852 master-0 kubenswrapper[29252]: I1203 20:16:29.767806 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Dec 03 20:16:29.784503 master-0 kubenswrapper[29252]: I1203 20:16:29.784444 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 20:16:29.785367 master-0 kubenswrapper[29252]: I1203 20:16:29.785334 29252 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Dec 03 20:16:29.824370 master-0 kubenswrapper[29252]: I1203 20:16:29.824225 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 20:16:29.827860 master-0 kubenswrapper[29252]: I1203 20:16:29.827815 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 20:16:29.861682 master-0 kubenswrapper[29252]: I1203 20:16:29.861617 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Dec 03 20:16:29.938162 master-0 kubenswrapper[29252]: I1203 20:16:29.938076 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Dec 03 20:16:29.969575 master-0 kubenswrapper[29252]: I1203 20:16:29.969455 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Dec 03 20:16:29.989708 master-0 kubenswrapper[29252]: I1203 20:16:29.989629 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 20:16:30.028342 master-0 kubenswrapper[29252]: I1203 20:16:30.028279 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 20:16:30.044977 master-0 kubenswrapper[29252]: I1203 20:16:30.044908 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 20:16:30.131256 master-0 kubenswrapper[29252]: I1203 20:16:30.131040 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-hswpp" Dec 03 20:16:30.299003 master-0 kubenswrapper[29252]: I1203 
20:16:30.298924 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Dec 03 20:16:30.465488 master-0 kubenswrapper[29252]: I1203 20:16:30.465305 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:16:30.466134 master-0 kubenswrapper[29252]: I1203 20:16:30.466070 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:16:30.476058 master-0 kubenswrapper[29252]: I1203 20:16:30.476004 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:16:30.525048 master-0 kubenswrapper[29252]: I1203 20:16:30.524953 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-gcvgk" Dec 03 20:16:30.594648 master-0 kubenswrapper[29252]: I1203 20:16:30.594572 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Dec 03 20:16:30.650109 master-0 kubenswrapper[29252]: I1203 20:16:30.650052 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 20:16:30.650371 master-0 kubenswrapper[29252]: I1203 20:16:30.650052 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 20:16:30.739624 master-0 kubenswrapper[29252]: I1203 20:16:30.739473 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 20:16:30.766243 master-0 kubenswrapper[29252]: I1203 20:16:30.766171 29252 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 20:16:30.833388 master-0 kubenswrapper[29252]: I1203 20:16:30.833278 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 03 20:16:30.851351 master-0 kubenswrapper[29252]: I1203 20:16:30.851252 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 20:16:30.858577 master-0 kubenswrapper[29252]: I1203 20:16:30.858502 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 20:16:30.915539 master-0 kubenswrapper[29252]: I1203 20:16:30.915439 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 20:16:30.931381 master-0 kubenswrapper[29252]: I1203 20:16:30.931317 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Dec 03 20:16:30.978502 master-0 kubenswrapper[29252]: I1203 20:16:30.978428 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 20:16:31.012156 master-0 kubenswrapper[29252]: I1203 20:16:31.011971 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Dec 03 20:16:31.021436 master-0 kubenswrapper[29252]: I1203 20:16:31.021385 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 20:16:31.032445 master-0 kubenswrapper[29252]: I1203 20:16:31.032369 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 20:16:31.039144 master-0 
kubenswrapper[29252]: I1203 20:16:31.039107 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Dec 03 20:16:31.132935 master-0 kubenswrapper[29252]: I1203 20:16:31.132873 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 20:16:31.184133 master-0 kubenswrapper[29252]: I1203 20:16:31.184091 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 20:16:31.188005 master-0 kubenswrapper[29252]: I1203 20:16:31.187958 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-mx9dn" Dec 03 20:16:31.300257 master-0 kubenswrapper[29252]: I1203 20:16:31.300136 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 03 20:16:31.456877 master-0 kubenswrapper[29252]: I1203 20:16:31.456821 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 20:16:31.577734 master-0 kubenswrapper[29252]: I1203 20:16:31.577586 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 20:16:31.616401 master-0 kubenswrapper[29252]: I1203 20:16:31.616334 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 20:16:31.632119 master-0 kubenswrapper[29252]: I1203 20:16:31.632080 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-5f72m" Dec 03 20:16:31.637062 master-0 kubenswrapper[29252]: I1203 20:16:31.637038 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 20:16:31.637642 master-0 
kubenswrapper[29252]: I1203 20:16:31.637620 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Dec 03 20:16:31.638541 master-0 kubenswrapper[29252]: I1203 20:16:31.638490 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-mdqv5" Dec 03 20:16:31.659228 master-0 kubenswrapper[29252]: I1203 20:16:31.659183 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 20:16:31.661167 master-0 kubenswrapper[29252]: I1203 20:16:31.661132 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Dec 03 20:16:31.668026 master-0 kubenswrapper[29252]: I1203 20:16:31.667984 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Dec 03 20:16:31.765280 master-0 kubenswrapper[29252]: I1203 20:16:31.765233 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 20:16:31.800036 master-0 kubenswrapper[29252]: I1203 20:16:31.799999 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 20:16:31.826655 master-0 kubenswrapper[29252]: I1203 20:16:31.826586 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:16:31.830809 master-0 kubenswrapper[29252]: I1203 20:16:31.830722 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:16:31.831604 master-0 kubenswrapper[29252]: I1203 20:16:31.831558 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" 
Dec 03 20:16:31.892829 master-0 kubenswrapper[29252]: I1203 20:16:31.892754 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 20:16:31.922008 master-0 kubenswrapper[29252]: I1203 20:16:31.921933 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 20:16:31.947436 master-0 kubenswrapper[29252]: I1203 20:16:31.947384 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 20:16:31.983729 master-0 kubenswrapper[29252]: I1203 20:16:31.983667 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 20:16:32.020222 master-0 kubenswrapper[29252]: I1203 20:16:32.020146 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Dec 03 20:16:32.033833 master-0 kubenswrapper[29252]: I1203 20:16:32.033720 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 20:16:32.060964 master-0 kubenswrapper[29252]: I1203 20:16:32.060912 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 20:16:32.081941 master-0 kubenswrapper[29252]: I1203 20:16:32.081836 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 20:16:32.082295 master-0 kubenswrapper[29252]: I1203 20:16:32.082262 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 03 20:16:32.180066 master-0 kubenswrapper[29252]: I1203 20:16:32.179986 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 20:16:32.197665 master-0 
kubenswrapper[29252]: I1203 20:16:32.197616 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-s7kpg" Dec 03 20:16:32.261374 master-0 kubenswrapper[29252]: I1203 20:16:32.261277 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 20:16:32.272747 master-0 kubenswrapper[29252]: I1203 20:16:32.272706 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 20:16:32.308971 master-0 kubenswrapper[29252]: I1203 20:16:32.308892 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 20:16:32.310950 master-0 kubenswrapper[29252]: I1203 20:16:32.310906 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Dec 03 20:16:32.371755 master-0 kubenswrapper[29252]: I1203 20:16:32.371669 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 20:16:32.717299 master-0 kubenswrapper[29252]: I1203 20:16:32.717181 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-4tvlh" Dec 03 20:16:32.718384 master-0 kubenswrapper[29252]: I1203 20:16:32.718354 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 20:16:32.735280 master-0 kubenswrapper[29252]: I1203 20:16:32.735235 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 20:16:32.759388 master-0 kubenswrapper[29252]: I1203 20:16:32.759333 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-xwq9l" Dec 03 20:16:32.794897 master-0 kubenswrapper[29252]: I1203 
20:16:32.794761 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 20:16:32.798733 master-0 kubenswrapper[29252]: I1203 20:16:32.798662 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-ln9n2" Dec 03 20:16:32.851881 master-0 kubenswrapper[29252]: I1203 20:16:32.851813 29252 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 20:16:32.858517 master-0 kubenswrapper[29252]: I1203 20:16:32.858475 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 20:16:32.860248 master-0 kubenswrapper[29252]: I1203 20:16:32.860192 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 03 20:16:32.860248 master-0 kubenswrapper[29252]: I1203 20:16:32.860238 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 03 20:16:32.865923 master-0 kubenswrapper[29252]: I1203 20:16:32.865869 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 20:16:32.888941 master-0 kubenswrapper[29252]: I1203 20:16:32.888141 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=20.888123857 podStartE2EDuration="20.888123857s" podCreationTimestamp="2025-12-03 20:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:16:32.887218846 +0000 UTC m=+427.700763899" watchObservedRunningTime="2025-12-03 20:16:32.888123857 +0000 UTC m=+427.701668820" Dec 03 20:16:32.912028 master-0 kubenswrapper[29252]: I1203 20:16:32.911962 
29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ngglc" Dec 03 20:16:32.923969 master-0 kubenswrapper[29252]: I1203 20:16:32.923881 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-6bdg2" Dec 03 20:16:32.975712 master-0 kubenswrapper[29252]: I1203 20:16:32.975576 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Dec 03 20:16:32.986178 master-0 kubenswrapper[29252]: I1203 20:16:32.986154 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 20:16:32.995861 master-0 kubenswrapper[29252]: I1203 20:16:32.995834 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-9xvtd" Dec 03 20:16:33.031987 master-0 kubenswrapper[29252]: I1203 20:16:33.031913 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 20:16:33.064489 master-0 kubenswrapper[29252]: I1203 20:16:33.064423 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 20:16:33.077253 master-0 kubenswrapper[29252]: I1203 20:16:33.077228 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Dec 03 20:16:33.155224 master-0 kubenswrapper[29252]: I1203 20:16:33.155153 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 20:16:33.183068 master-0 kubenswrapper[29252]: I1203 20:16:33.183021 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 20:16:33.271196 master-0 kubenswrapper[29252]: I1203 
20:16:33.271090 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 20:16:33.307734 master-0 kubenswrapper[29252]: I1203 20:16:33.307623 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 20:16:33.321218 master-0 kubenswrapper[29252]: I1203 20:16:33.320757 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Dec 03 20:16:33.349870 master-0 kubenswrapper[29252]: I1203 20:16:33.349648 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 20:16:33.365962 master-0 kubenswrapper[29252]: I1203 20:16:33.365900 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 20:16:33.379802 master-0 kubenswrapper[29252]: I1203 20:16:33.379713 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 20:16:33.405661 master-0 kubenswrapper[29252]: I1203 20:16:33.405561 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Dec 03 20:16:33.488377 master-0 kubenswrapper[29252]: I1203 20:16:33.488299 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 20:16:33.490069 master-0 kubenswrapper[29252]: I1203 20:16:33.490003 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 20:16:33.506510 master-0 kubenswrapper[29252]: I1203 20:16:33.506402 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Dec 03 20:16:33.547485 master-0 kubenswrapper[29252]: 
I1203 20:16:33.547303 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 20:16:33.600076 master-0 kubenswrapper[29252]: I1203 20:16:33.599976 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 20:16:33.654982 master-0 kubenswrapper[29252]: I1203 20:16:33.654900 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 20:16:33.682765 master-0 kubenswrapper[29252]: I1203 20:16:33.682691 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 20:16:33.705967 master-0 kubenswrapper[29252]: I1203 20:16:33.705873 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 20:16:33.720490 master-0 kubenswrapper[29252]: I1203 20:16:33.720423 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Dec 03 20:16:33.748397 master-0 kubenswrapper[29252]: I1203 20:16:33.748337 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 20:16:33.805105 master-0 kubenswrapper[29252]: I1203 20:16:33.804960 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 20:16:33.826470 master-0 kubenswrapper[29252]: I1203 20:16:33.826420 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 20:16:33.842184 master-0 kubenswrapper[29252]: I1203 20:16:33.842083 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 20:16:33.879625 master-0 
kubenswrapper[29252]: I1203 20:16:33.879557 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 03 20:16:33.923427 master-0 kubenswrapper[29252]: I1203 20:16:33.923313 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 20:16:33.924762 master-0 kubenswrapper[29252]: I1203 20:16:33.924726 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 20:16:33.961581 master-0 kubenswrapper[29252]: I1203 20:16:33.961466 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Dec 03 20:16:33.999643 master-0 kubenswrapper[29252]: I1203 20:16:33.999585 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 20:16:34.058860 master-0 kubenswrapper[29252]: I1203 20:16:34.058732 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 20:16:34.138382 master-0 kubenswrapper[29252]: I1203 20:16:34.138311 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-4hc9t" Dec 03 20:16:34.162914 master-0 kubenswrapper[29252]: I1203 20:16:34.162076 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Dec 03 20:16:34.227828 master-0 kubenswrapper[29252]: I1203 20:16:34.227724 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Dec 03 20:16:34.265029 master-0 kubenswrapper[29252]: I1203 20:16:34.264950 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Dec 03 20:16:34.309332 master-0 kubenswrapper[29252]: 
I1203 20:16:34.309208 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 20:16:34.331517 master-0 kubenswrapper[29252]: I1203 20:16:34.331454 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 20:16:34.332291 master-0 kubenswrapper[29252]: I1203 20:16:34.332237 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 20:16:34.339077 master-0 kubenswrapper[29252]: I1203 20:16:34.339026 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Dec 03 20:16:34.386337 master-0 kubenswrapper[29252]: I1203 20:16:34.386253 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 20:16:34.485461 master-0 kubenswrapper[29252]: I1203 20:16:34.485359 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 20:16:34.498429 master-0 kubenswrapper[29252]: I1203 20:16:34.498369 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 20:16:34.536399 master-0 kubenswrapper[29252]: I1203 20:16:34.536289 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 20:16:34.584423 master-0 kubenswrapper[29252]: I1203 20:16:34.584182 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 20:16:34.635711 master-0 kubenswrapper[29252]: I1203 20:16:34.635645 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 20:16:34.638309 master-0 kubenswrapper[29252]: 
I1203 20:16:34.638263 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Dec 03 20:16:34.719057 master-0 kubenswrapper[29252]: I1203 20:16:34.718994 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 20:16:34.752669 master-0 kubenswrapper[29252]: I1203 20:16:34.752437 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 20:16:34.831130 master-0 kubenswrapper[29252]: I1203 20:16:34.831017 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Dec 03 20:16:34.972891 master-0 kubenswrapper[29252]: I1203 20:16:34.972831 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 20:16:34.978285 master-0 kubenswrapper[29252]: I1203 20:16:34.978226 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 20:16:35.010129 master-0 kubenswrapper[29252]: I1203 20:16:35.010065 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 03 20:16:35.142633 master-0 kubenswrapper[29252]: I1203 20:16:35.142574 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 20:16:35.320075 master-0 kubenswrapper[29252]: I1203 20:16:35.319851 29252 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 03 20:16:35.321053 master-0 kubenswrapper[29252]: I1203 20:16:35.320278 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" 
podUID="c98a8d85d3901d33f6fe192bdc7172aa" containerName="startup-monitor" containerID="cri-o://0c90690bf74079ace519f114b7de842fc09b8f38d743012d018a57a54c703915" gracePeriod=5 Dec 03 20:16:35.327567 master-0 kubenswrapper[29252]: I1203 20:16:35.327501 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Dec 03 20:16:35.430768 master-0 kubenswrapper[29252]: I1203 20:16:35.430696 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 20:16:35.438111 master-0 kubenswrapper[29252]: I1203 20:16:35.438044 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 20:16:35.458982 master-0 kubenswrapper[29252]: I1203 20:16:35.458899 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 20:16:35.582892 master-0 kubenswrapper[29252]: I1203 20:16:35.582731 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Dec 03 20:16:35.583989 master-0 kubenswrapper[29252]: I1203 20:16:35.583949 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 20:16:35.655448 master-0 kubenswrapper[29252]: I1203 20:16:35.655351 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-a6k1coh2n07mf" Dec 03 20:16:35.672714 master-0 kubenswrapper[29252]: I1203 20:16:35.672592 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 20:16:35.905990 master-0 kubenswrapper[29252]: I1203 20:16:35.905683 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Dec 03 20:16:35.910980 master-0 
kubenswrapper[29252]: I1203 20:16:35.909237 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 20:16:35.920046 master-0 kubenswrapper[29252]: I1203 20:16:35.919949 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 20:16:35.926121 master-0 kubenswrapper[29252]: I1203 20:16:35.926058 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 20:16:35.987530 master-0 kubenswrapper[29252]: I1203 20:16:35.987471 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 20:16:35.996461 master-0 kubenswrapper[29252]: I1203 20:16:35.996418 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 20:16:36.036806 master-0 kubenswrapper[29252]: I1203 20:16:36.036731 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 20:16:36.038581 master-0 kubenswrapper[29252]: I1203 20:16:36.038511 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 20:16:36.056431 master-0 kubenswrapper[29252]: I1203 20:16:36.056330 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Dec 03 20:16:36.062376 master-0 kubenswrapper[29252]: I1203 20:16:36.062316 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 20:16:36.093725 master-0 kubenswrapper[29252]: I1203 20:16:36.093651 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Dec 03 20:16:36.136892 master-0 kubenswrapper[29252]: I1203 20:16:36.136769 29252 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 20:16:36.163260 master-0 kubenswrapper[29252]: I1203 20:16:36.163124 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 20:16:36.240850 master-0 kubenswrapper[29252]: I1203 20:16:36.240729 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Dec 03 20:16:36.313350 master-0 kubenswrapper[29252]: I1203 20:16:36.313248 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 20:16:36.318602 master-0 kubenswrapper[29252]: I1203 20:16:36.318551 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 20:16:36.456032 master-0 kubenswrapper[29252]: I1203 20:16:36.455841 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 20:16:36.546696 master-0 kubenswrapper[29252]: I1203 20:16:36.546617 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Dec 03 20:16:36.547846 master-0 kubenswrapper[29252]: I1203 20:16:36.547771 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 20:16:36.548910 master-0 kubenswrapper[29252]: I1203 20:16:36.548874 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-56c9b9fa8d9gs" Dec 03 20:16:36.624332 master-0 kubenswrapper[29252]: I1203 20:16:36.624248 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 20:16:36.690810 master-0 kubenswrapper[29252]: I1203 20:16:36.690729 29252 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-x7648" Dec 03 20:16:36.731418 master-0 kubenswrapper[29252]: I1203 20:16:36.731298 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Dec 03 20:16:36.737934 master-0 kubenswrapper[29252]: I1203 20:16:36.737878 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Dec 03 20:16:36.783547 master-0 kubenswrapper[29252]: I1203 20:16:36.783404 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 20:16:36.830516 master-0 kubenswrapper[29252]: I1203 20:16:36.830372 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 20:16:36.868377 master-0 kubenswrapper[29252]: I1203 20:16:36.868307 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Dec 03 20:16:36.951746 master-0 kubenswrapper[29252]: I1203 20:16:36.951640 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 20:16:36.954089 master-0 kubenswrapper[29252]: I1203 20:16:36.954042 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 20:16:36.981879 master-0 kubenswrapper[29252]: I1203 20:16:36.981747 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 20:16:37.050349 master-0 kubenswrapper[29252]: I1203 20:16:37.050227 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 20:16:37.098700 master-0 kubenswrapper[29252]: I1203 20:16:37.098620 29252 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Dec 03 20:16:37.123385 master-0 kubenswrapper[29252]: I1203 20:16:37.123317 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 20:16:37.225812 master-0 kubenswrapper[29252]: I1203 20:16:37.225682 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 20:16:37.285044 master-0 kubenswrapper[29252]: I1203 20:16:37.284797 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Dec 03 20:16:37.305830 master-0 kubenswrapper[29252]: I1203 20:16:37.305691 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 20:16:37.307336 master-0 kubenswrapper[29252]: I1203 20:16:37.307281 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 20:16:37.363613 master-0 kubenswrapper[29252]: I1203 20:16:37.363310 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Dec 03 20:16:37.376586 master-0 kubenswrapper[29252]: I1203 20:16:37.376500 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 20:16:37.409805 master-0 kubenswrapper[29252]: I1203 20:16:37.400737 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 20:16:37.488014 master-0 kubenswrapper[29252]: I1203 20:16:37.487922 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 20:16:37.489337 master-0 kubenswrapper[29252]: I1203 
20:16:37.489269 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-krxhq" Dec 03 20:16:37.613062 master-0 kubenswrapper[29252]: I1203 20:16:37.612961 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Dec 03 20:16:37.621659 master-0 kubenswrapper[29252]: I1203 20:16:37.621581 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 20:16:37.812205 master-0 kubenswrapper[29252]: I1203 20:16:37.812131 29252 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 20:16:37.871070 master-0 kubenswrapper[29252]: I1203 20:16:37.870933 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Dec 03 20:16:38.118543 master-0 kubenswrapper[29252]: I1203 20:16:38.118451 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Dec 03 20:16:38.142874 master-0 kubenswrapper[29252]: I1203 20:16:38.142668 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Dec 03 20:16:38.445364 master-0 kubenswrapper[29252]: I1203 20:16:38.445178 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 20:16:38.445364 master-0 kubenswrapper[29252]: I1203 20:16:38.445199 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Dec 03 20:16:38.500753 master-0 kubenswrapper[29252]: I1203 20:16:38.500670 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 20:16:38.534335 master-0 
kubenswrapper[29252]: I1203 20:16:38.534247 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 20:16:38.538681 master-0 kubenswrapper[29252]: I1203 20:16:38.538627 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Dec 03 20:16:38.558871 master-0 kubenswrapper[29252]: I1203 20:16:38.558817 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 20:16:38.565578 master-0 kubenswrapper[29252]: I1203 20:16:38.565526 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 20:16:38.752193 master-0 kubenswrapper[29252]: I1203 20:16:38.752038 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Dec 03 20:16:38.782260 master-0 kubenswrapper[29252]: I1203 20:16:38.782185 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Dec 03 20:16:38.807240 master-0 kubenswrapper[29252]: I1203 20:16:38.807169 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 20:16:38.936890 master-0 kubenswrapper[29252]: I1203 20:16:38.936810 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 20:16:38.972090 master-0 kubenswrapper[29252]: I1203 20:16:38.972019 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 20:16:38.973916 master-0 kubenswrapper[29252]: I1203 20:16:38.973873 29252 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 20:16:38.995812 master-0 kubenswrapper[29252]: 
I1203 20:16:38.995739 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 20:16:39.021437 master-0 kubenswrapper[29252]: I1203 20:16:39.021264 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-2zv7f" Dec 03 20:16:39.037925 master-0 kubenswrapper[29252]: I1203 20:16:39.037814 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 20:16:39.057548 master-0 kubenswrapper[29252]: I1203 20:16:39.057480 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Dec 03 20:16:39.404079 master-0 kubenswrapper[29252]: I1203 20:16:39.403998 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 20:16:39.882028 master-0 kubenswrapper[29252]: I1203 20:16:39.881950 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65c74dc56f-mlqjw"] Dec 03 20:16:39.904234 master-0 kubenswrapper[29252]: I1203 20:16:39.904176 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-l56l4" Dec 03 20:16:39.908315 master-0 kubenswrapper[29252]: I1203 20:16:39.908271 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 20:16:39.919946 master-0 kubenswrapper[29252]: I1203 20:16:39.919906 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 20:16:39.942005 master-0 kubenswrapper[29252]: I1203 20:16:39.941945 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Dec 03 20:16:39.943659 master-0 
kubenswrapper[29252]: I1203 20:16:39.943618 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5656567747-w9bgn"] Dec 03 20:16:39.944005 master-0 kubenswrapper[29252]: E1203 20:16:39.943978 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee0d023-d1ab-4c75-9a92-3a0e42d05168" containerName="installer" Dec 03 20:16:39.944005 master-0 kubenswrapper[29252]: I1203 20:16:39.944003 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee0d023-d1ab-4c75-9a92-3a0e42d05168" containerName="installer" Dec 03 20:16:39.944080 master-0 kubenswrapper[29252]: E1203 20:16:39.944059 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98a8d85d3901d33f6fe192bdc7172aa" containerName="startup-monitor" Dec 03 20:16:39.944080 master-0 kubenswrapper[29252]: I1203 20:16:39.944067 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98a8d85d3901d33f6fe192bdc7172aa" containerName="startup-monitor" Dec 03 20:16:39.944145 master-0 kubenswrapper[29252]: E1203 20:16:39.944083 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="108176a9-101d-4204-8ed3-4ed41ccdaae0" containerName="installer" Dec 03 20:16:39.944145 master-0 kubenswrapper[29252]: I1203 20:16:39.944089 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="108176a9-101d-4204-8ed3-4ed41ccdaae0" containerName="installer" Dec 03 20:16:39.944228 master-0 kubenswrapper[29252]: I1203 20:16:39.944210 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98a8d85d3901d33f6fe192bdc7172aa" containerName="startup-monitor" Dec 03 20:16:39.944263 master-0 kubenswrapper[29252]: I1203 20:16:39.944236 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="108176a9-101d-4204-8ed3-4ed41ccdaae0" containerName="installer" Dec 03 20:16:39.944263 master-0 kubenswrapper[29252]: I1203 20:16:39.944251 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee0d023-d1ab-4c75-9a92-3a0e42d05168" 
containerName="installer" Dec 03 20:16:39.944810 master-0 kubenswrapper[29252]: I1203 20:16:39.944753 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:39.948668 master-0 kubenswrapper[29252]: I1203 20:16:39.948618 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5656567747-w9bgn"] Dec 03 20:16:39.987481 master-0 kubenswrapper[29252]: I1203 20:16:39.987426 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 20:16:40.040175 master-0 kubenswrapper[29252]: I1203 20:16:40.039099 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 20:16:40.048901 master-0 kubenswrapper[29252]: I1203 20:16:40.048848 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 20:16:40.066463 master-0 kubenswrapper[29252]: I1203 20:16:40.065611 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 20:16:40.071362 master-0 kubenswrapper[29252]: I1203 20:16:40.071324 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2329bde6-b226-4fca-864d-b152ccf49cf9-console-serving-cert\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.071448 master-0 kubenswrapper[29252]: I1203 20:16:40.071370 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-service-ca\") pod \"console-5656567747-w9bgn\" (UID: 
\"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.071448 master-0 kubenswrapper[29252]: I1203 20:16:40.071425 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-console-config\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.071515 master-0 kubenswrapper[29252]: I1203 20:16:40.071455 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-oauth-serving-cert\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.071515 master-0 kubenswrapper[29252]: I1203 20:16:40.071478 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2329bde6-b226-4fca-864d-b152ccf49cf9-console-oauth-config\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.071515 master-0 kubenswrapper[29252]: I1203 20:16:40.071498 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-trusted-ca-bundle\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.071604 master-0 kubenswrapper[29252]: I1203 20:16:40.071534 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dkdq4\" (UniqueName: \"kubernetes.io/projected/2329bde6-b226-4fca-864d-b152ccf49cf9-kube-api-access-dkdq4\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.173422 master-0 kubenswrapper[29252]: I1203 20:16:40.173296 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2329bde6-b226-4fca-864d-b152ccf49cf9-console-serving-cert\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.173422 master-0 kubenswrapper[29252]: I1203 20:16:40.173365 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-service-ca\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.173422 master-0 kubenswrapper[29252]: I1203 20:16:40.173385 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-console-config\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.173422 master-0 kubenswrapper[29252]: I1203 20:16:40.173414 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-oauth-serving-cert\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.173688 master-0 kubenswrapper[29252]: I1203 20:16:40.173440 29252 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2329bde6-b226-4fca-864d-b152ccf49cf9-console-oauth-config\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.173688 master-0 kubenswrapper[29252]: I1203 20:16:40.173462 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-trusted-ca-bundle\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.173688 master-0 kubenswrapper[29252]: I1203 20:16:40.173498 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkdq4\" (UniqueName: \"kubernetes.io/projected/2329bde6-b226-4fca-864d-b152ccf49cf9-kube-api-access-dkdq4\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.174680 master-0 kubenswrapper[29252]: I1203 20:16:40.174628 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-oauth-serving-cert\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.175043 master-0 kubenswrapper[29252]: I1203 20:16:40.174992 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-console-config\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.175114 master-0 kubenswrapper[29252]: I1203 
20:16:40.175007 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-trusted-ca-bundle\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.175878 master-0 kubenswrapper[29252]: I1203 20:16:40.175664 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-service-ca\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.176986 master-0 kubenswrapper[29252]: I1203 20:16:40.176943 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2329bde6-b226-4fca-864d-b152ccf49cf9-console-serving-cert\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.178703 master-0 kubenswrapper[29252]: I1203 20:16:40.178668 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2329bde6-b226-4fca-864d-b152ccf49cf9-console-oauth-config\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.189992 master-0 kubenswrapper[29252]: I1203 20:16:40.189953 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkdq4\" (UniqueName: \"kubernetes.io/projected/2329bde6-b226-4fca-864d-b152ccf49cf9-kube-api-access-dkdq4\") pod \"console-5656567747-w9bgn\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.246785 master-0 kubenswrapper[29252]: 
I1203 20:16:40.246716 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 20:16:40.268759 master-0 kubenswrapper[29252]: I1203 20:16:40.268693 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:40.470244 master-0 kubenswrapper[29252]: I1203 20:16:40.470092 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:16:40.604230 master-0 kubenswrapper[29252]: I1203 20:16:40.604149 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 20:16:40.690292 master-0 kubenswrapper[29252]: I1203 20:16:40.690234 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-fekh162m2nm7j" Dec 03 20:16:40.705508 master-0 kubenswrapper[29252]: W1203 20:16:40.705417 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2329bde6_b226_4fca_864d_b152ccf49cf9.slice/crio-7950d43f53958f94be90826f9245aa2b01033789ccdd0dbeec841b3d3de5f8a9 WatchSource:0}: Error finding container 7950d43f53958f94be90826f9245aa2b01033789ccdd0dbeec841b3d3de5f8a9: Status 404 returned error can't find the container with id 7950d43f53958f94be90826f9245aa2b01033789ccdd0dbeec841b3d3de5f8a9 Dec 03 20:16:40.706327 master-0 kubenswrapper[29252]: I1203 20:16:40.706255 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5656567747-w9bgn"] Dec 03 20:16:40.864973 master-0 kubenswrapper[29252]: I1203 20:16:40.864925 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_c98a8d85d3901d33f6fe192bdc7172aa/startup-monitor/0.log" Dec 03 20:16:40.865167 master-0 
kubenswrapper[29252]: I1203 20:16:40.865057 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:16:40.884221 master-0 kubenswrapper[29252]: I1203 20:16:40.884083 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-pod-resource-dir\") pod \"c98a8d85d3901d33f6fe192bdc7172aa\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " Dec 03 20:16:40.884797 master-0 kubenswrapper[29252]: I1203 20:16:40.884397 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-manifests\") pod \"c98a8d85d3901d33f6fe192bdc7172aa\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " Dec 03 20:16:40.884797 master-0 kubenswrapper[29252]: I1203 20:16:40.884428 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-var-lock\") pod \"c98a8d85d3901d33f6fe192bdc7172aa\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " Dec 03 20:16:40.884797 master-0 kubenswrapper[29252]: I1203 20:16:40.884514 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-var-log\") pod \"c98a8d85d3901d33f6fe192bdc7172aa\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " Dec 03 20:16:40.884797 master-0 kubenswrapper[29252]: I1203 20:16:40.884532 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-manifests" (OuterVolumeSpecName: "manifests") pod "c98a8d85d3901d33f6fe192bdc7172aa" (UID: "c98a8d85d3901d33f6fe192bdc7172aa"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:16:40.884797 master-0 kubenswrapper[29252]: I1203 20:16:40.884574 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-resource-dir\") pod \"c98a8d85d3901d33f6fe192bdc7172aa\" (UID: \"c98a8d85d3901d33f6fe192bdc7172aa\") " Dec 03 20:16:40.884797 master-0 kubenswrapper[29252]: I1203 20:16:40.884582 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-var-lock" (OuterVolumeSpecName: "var-lock") pod "c98a8d85d3901d33f6fe192bdc7172aa" (UID: "c98a8d85d3901d33f6fe192bdc7172aa"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:16:40.884797 master-0 kubenswrapper[29252]: I1203 20:16:40.884633 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-var-log" (OuterVolumeSpecName: "var-log") pod "c98a8d85d3901d33f6fe192bdc7172aa" (UID: "c98a8d85d3901d33f6fe192bdc7172aa"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:16:40.884797 master-0 kubenswrapper[29252]: I1203 20:16:40.884708 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "c98a8d85d3901d33f6fe192bdc7172aa" (UID: "c98a8d85d3901d33f6fe192bdc7172aa"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:16:40.885137 master-0 kubenswrapper[29252]: I1203 20:16:40.885105 29252 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-manifests\") on node \"master-0\" DevicePath \"\"" Dec 03 20:16:40.885137 master-0 kubenswrapper[29252]: I1203 20:16:40.885133 29252 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 20:16:40.885212 master-0 kubenswrapper[29252]: I1203 20:16:40.885146 29252 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-var-log\") on node \"master-0\" DevicePath \"\"" Dec 03 20:16:40.885212 master-0 kubenswrapper[29252]: I1203 20:16:40.885157 29252 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:16:40.895352 master-0 kubenswrapper[29252]: I1203 20:16:40.895254 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "c98a8d85d3901d33f6fe192bdc7172aa" (UID: "c98a8d85d3901d33f6fe192bdc7172aa"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:16:40.951156 master-0 kubenswrapper[29252]: I1203 20:16:40.951101 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_c98a8d85d3901d33f6fe192bdc7172aa/startup-monitor/0.log" Dec 03 20:16:40.951349 master-0 kubenswrapper[29252]: I1203 20:16:40.951178 29252 generic.go:334] "Generic (PLEG): container finished" podID="c98a8d85d3901d33f6fe192bdc7172aa" containerID="0c90690bf74079ace519f114b7de842fc09b8f38d743012d018a57a54c703915" exitCode=137 Dec 03 20:16:40.951349 master-0 kubenswrapper[29252]: I1203 20:16:40.951294 29252 scope.go:117] "RemoveContainer" containerID="0c90690bf74079ace519f114b7de842fc09b8f38d743012d018a57a54c703915" Dec 03 20:16:40.951902 master-0 kubenswrapper[29252]: I1203 20:16:40.951312 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 20:16:40.953527 master-0 kubenswrapper[29252]: I1203 20:16:40.953478 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5656567747-w9bgn" event={"ID":"2329bde6-b226-4fca-864d-b152ccf49cf9","Type":"ContainerStarted","Data":"2d07197a9cbe389114d58e42fc85f3a4ac1454d3224bee1077d7c411b3f2711d"} Dec 03 20:16:40.953569 master-0 kubenswrapper[29252]: I1203 20:16:40.953534 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5656567747-w9bgn" event={"ID":"2329bde6-b226-4fca-864d-b152ccf49cf9","Type":"ContainerStarted","Data":"7950d43f53958f94be90826f9245aa2b01033789ccdd0dbeec841b3d3de5f8a9"} Dec 03 20:16:40.973898 master-0 kubenswrapper[29252]: I1203 20:16:40.973856 29252 scope.go:117] "RemoveContainer" containerID="0c90690bf74079ace519f114b7de842fc09b8f38d743012d018a57a54c703915" Dec 03 20:16:40.974385 master-0 kubenswrapper[29252]: E1203 20:16:40.974339 29252 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0c90690bf74079ace519f114b7de842fc09b8f38d743012d018a57a54c703915\": container with ID starting with 0c90690bf74079ace519f114b7de842fc09b8f38d743012d018a57a54c703915 not found: ID does not exist" containerID="0c90690bf74079ace519f114b7de842fc09b8f38d743012d018a57a54c703915" Dec 03 20:16:40.974460 master-0 kubenswrapper[29252]: I1203 20:16:40.974385 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c90690bf74079ace519f114b7de842fc09b8f38d743012d018a57a54c703915"} err="failed to get container status \"0c90690bf74079ace519f114b7de842fc09b8f38d743012d018a57a54c703915\": rpc error: code = NotFound desc = could not find container \"0c90690bf74079ace519f114b7de842fc09b8f38d743012d018a57a54c703915\": container with ID starting with 0c90690bf74079ace519f114b7de842fc09b8f38d743012d018a57a54c703915 not found: ID does not exist" Dec 03 20:16:40.977017 master-0 kubenswrapper[29252]: I1203 20:16:40.976941 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5656567747-w9bgn" podStartSLOduration=1.9769227919999999 podStartE2EDuration="1.976922792s" podCreationTimestamp="2025-12-03 20:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:16:40.976750968 +0000 UTC m=+435.790295921" watchObservedRunningTime="2025-12-03 20:16:40.976922792 +0000 UTC m=+435.790467755" Dec 03 20:16:40.986296 master-0 kubenswrapper[29252]: I1203 20:16:40.986235 29252 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/c98a8d85d3901d33f6fe192bdc7172aa-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:16:41.280047 master-0 kubenswrapper[29252]: I1203 20:16:41.279997 29252 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 20:16:41.424457 master-0 kubenswrapper[29252]: I1203 20:16:41.424399 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c98a8d85d3901d33f6fe192bdc7172aa" path="/var/lib/kubelet/pods/c98a8d85d3901d33f6fe192bdc7172aa/volumes" Dec 03 20:16:43.000263 master-0 kubenswrapper[29252]: I1203 20:16:43.000205 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-kfnzv" Dec 03 20:16:50.270460 master-0 kubenswrapper[29252]: I1203 20:16:50.270355 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:50.270460 master-0 kubenswrapper[29252]: I1203 20:16:50.270441 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:50.277165 master-0 kubenswrapper[29252]: I1203 20:16:50.277083 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:51.056209 master-0 kubenswrapper[29252]: I1203 20:16:51.056098 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:16:51.173403 master-0 kubenswrapper[29252]: I1203 20:16:51.173049 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6465b775c-7mmtn"] Dec 03 20:17:04.933743 master-0 kubenswrapper[29252]: I1203 20:17:04.933586 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-65c74dc56f-mlqjw" podUID="e13ed7cc-6322-4676-88fc-363cff00f509" containerName="console" containerID="cri-o://f54a06368dd236747f03ccfb28200dab1a76dccafc53a58a78b24d448650bca8" gracePeriod=15 Dec 03 20:17:05.194309 master-0 kubenswrapper[29252]: I1203 20:17:05.194144 29252 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-65c74dc56f-mlqjw_e13ed7cc-6322-4676-88fc-363cff00f509/console/0.log" Dec 03 20:17:05.194309 master-0 kubenswrapper[29252]: I1203 20:17:05.194196 29252 generic.go:334] "Generic (PLEG): container finished" podID="e13ed7cc-6322-4676-88fc-363cff00f509" containerID="f54a06368dd236747f03ccfb28200dab1a76dccafc53a58a78b24d448650bca8" exitCode=2 Dec 03 20:17:05.194309 master-0 kubenswrapper[29252]: I1203 20:17:05.194226 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65c74dc56f-mlqjw" event={"ID":"e13ed7cc-6322-4676-88fc-363cff00f509","Type":"ContainerDied","Data":"f54a06368dd236747f03ccfb28200dab1a76dccafc53a58a78b24d448650bca8"} Dec 03 20:17:05.445424 master-0 kubenswrapper[29252]: I1203 20:17:05.445277 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65c74dc56f-mlqjw_e13ed7cc-6322-4676-88fc-363cff00f509/console/0.log" Dec 03 20:17:05.445424 master-0 kubenswrapper[29252]: I1203 20:17:05.445362 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65c74dc56f-mlqjw" Dec 03 20:17:05.514900 master-0 kubenswrapper[29252]: I1203 20:17:05.513853 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-service-ca\") pod \"e13ed7cc-6322-4676-88fc-363cff00f509\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " Dec 03 20:17:05.514900 master-0 kubenswrapper[29252]: I1203 20:17:05.513975 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e13ed7cc-6322-4676-88fc-363cff00f509-console-serving-cert\") pod \"e13ed7cc-6322-4676-88fc-363cff00f509\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " Dec 03 20:17:05.514900 master-0 kubenswrapper[29252]: I1203 20:17:05.514022 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-console-config\") pod \"e13ed7cc-6322-4676-88fc-363cff00f509\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " Dec 03 20:17:05.514900 master-0 kubenswrapper[29252]: I1203 20:17:05.514083 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e13ed7cc-6322-4676-88fc-363cff00f509-console-oauth-config\") pod \"e13ed7cc-6322-4676-88fc-363cff00f509\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " Dec 03 20:17:05.514900 master-0 kubenswrapper[29252]: I1203 20:17:05.514114 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hs6x\" (UniqueName: \"kubernetes.io/projected/e13ed7cc-6322-4676-88fc-363cff00f509-kube-api-access-5hs6x\") pod \"e13ed7cc-6322-4676-88fc-363cff00f509\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " Dec 03 20:17:05.514900 master-0 kubenswrapper[29252]: I1203 
20:17:05.514140 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-trusted-ca-bundle\") pod \"e13ed7cc-6322-4676-88fc-363cff00f509\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " Dec 03 20:17:05.514900 master-0 kubenswrapper[29252]: I1203 20:17:05.514161 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-oauth-serving-cert\") pod \"e13ed7cc-6322-4676-88fc-363cff00f509\" (UID: \"e13ed7cc-6322-4676-88fc-363cff00f509\") " Dec 03 20:17:05.515605 master-0 kubenswrapper[29252]: I1203 20:17:05.515554 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-service-ca" (OuterVolumeSpecName: "service-ca") pod "e13ed7cc-6322-4676-88fc-363cff00f509" (UID: "e13ed7cc-6322-4676-88fc-363cff00f509"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:17:05.516848 master-0 kubenswrapper[29252]: I1203 20:17:05.516689 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e13ed7cc-6322-4676-88fc-363cff00f509" (UID: "e13ed7cc-6322-4676-88fc-363cff00f509"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:17:05.516941 master-0 kubenswrapper[29252]: I1203 20:17:05.516859 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e13ed7cc-6322-4676-88fc-363cff00f509" (UID: "e13ed7cc-6322-4676-88fc-363cff00f509"). 
InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:17:05.517078 master-0 kubenswrapper[29252]: I1203 20:17:05.517029 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-console-config" (OuterVolumeSpecName: "console-config") pod "e13ed7cc-6322-4676-88fc-363cff00f509" (UID: "e13ed7cc-6322-4676-88fc-363cff00f509"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:17:05.518167 master-0 kubenswrapper[29252]: I1203 20:17:05.518129 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e13ed7cc-6322-4676-88fc-363cff00f509-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e13ed7cc-6322-4676-88fc-363cff00f509" (UID: "e13ed7cc-6322-4676-88fc-363cff00f509"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:17:05.520062 master-0 kubenswrapper[29252]: I1203 20:17:05.520026 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e13ed7cc-6322-4676-88fc-363cff00f509-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e13ed7cc-6322-4676-88fc-363cff00f509" (UID: "e13ed7cc-6322-4676-88fc-363cff00f509"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:17:05.522121 master-0 kubenswrapper[29252]: I1203 20:17:05.522053 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e13ed7cc-6322-4676-88fc-363cff00f509-kube-api-access-5hs6x" (OuterVolumeSpecName: "kube-api-access-5hs6x") pod "e13ed7cc-6322-4676-88fc-363cff00f509" (UID: "e13ed7cc-6322-4676-88fc-363cff00f509"). InnerVolumeSpecName "kube-api-access-5hs6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:17:05.616463 master-0 kubenswrapper[29252]: I1203 20:17:05.616106 29252 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:05.616463 master-0 kubenswrapper[29252]: I1203 20:17:05.616148 29252 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e13ed7cc-6322-4676-88fc-363cff00f509-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:05.616463 master-0 kubenswrapper[29252]: I1203 20:17:05.616164 29252 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-console-config\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:05.616463 master-0 kubenswrapper[29252]: I1203 20:17:05.616177 29252 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e13ed7cc-6322-4676-88fc-363cff00f509-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:05.616463 master-0 kubenswrapper[29252]: I1203 20:17:05.616189 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hs6x\" (UniqueName: \"kubernetes.io/projected/e13ed7cc-6322-4676-88fc-363cff00f509-kube-api-access-5hs6x\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:05.616463 master-0 kubenswrapper[29252]: I1203 20:17:05.616200 29252 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:05.616463 master-0 kubenswrapper[29252]: I1203 20:17:05.616212 29252 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/e13ed7cc-6322-4676-88fc-363cff00f509-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:05.844000 master-0 kubenswrapper[29252]: I1203 20:17:05.843871 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 03 20:17:05.844214 master-0 kubenswrapper[29252]: I1203 20:17:05.844182 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="prometheus" containerID="cri-o://1d7cd4ec51aa6a09535a7654afad8e7721a38290da904e6611ba943292b9d2ad" gracePeriod=600 Dec 03 20:17:05.844356 master-0 kubenswrapper[29252]: I1203 20:17:05.844292 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="kube-rbac-proxy-web" containerID="cri-o://f181c4fccef6b5098a5cf8c94ad1bdd8de61d3ad6b762757fe9a7bdfb22c63a5" gracePeriod=600 Dec 03 20:17:05.844471 master-0 kubenswrapper[29252]: I1203 20:17:05.844316 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="config-reloader" containerID="cri-o://3ad78927878b51cd7f4848ccccfad56ebf4205f807aa88db1abae4a630d88721" gracePeriod=600 Dec 03 20:17:05.844471 master-0 kubenswrapper[29252]: I1203 20:17:05.844318 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="thanos-sidecar" containerID="cri-o://e1de45814aaa6ff018c658de9e1964f6f62f6d7e90feb94cdb56eb7462f11358" gracePeriod=600 Dec 03 20:17:05.844471 master-0 kubenswrapper[29252]: I1203 20:17:05.844417 29252 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-monitoring/prometheus-k8s-0" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="kube-rbac-proxy-thanos" containerID="cri-o://da1fa4988d2b94cb2e703112b63c61fa999bce00d13cedeb5375da4e716443a5" gracePeriod=600 Dec 03 20:17:05.844567 master-0 kubenswrapper[29252]: I1203 20:17:05.844266 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="kube-rbac-proxy" containerID="cri-o://0bd8d740d658995d8dd28eae580836ac58f46ce3526c9a1e22c7f53333d01c60" gracePeriod=600 Dec 03 20:17:06.088442 master-0 kubenswrapper[29252]: E1203 20:17:06.088378 29252 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1d7cd4ec51aa6a09535a7654afad8e7721a38290da904e6611ba943292b9d2ad is running failed: container process not found" containerID="1d7cd4ec51aa6a09535a7654afad8e7721a38290da904e6611ba943292b9d2ad" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Dec 03 20:17:06.089161 master-0 kubenswrapper[29252]: E1203 20:17:06.088624 29252 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1d7cd4ec51aa6a09535a7654afad8e7721a38290da904e6611ba943292b9d2ad is running failed: container process not found" containerID="1d7cd4ec51aa6a09535a7654afad8e7721a38290da904e6611ba943292b9d2ad" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Dec 03 20:17:06.089161 master-0 kubenswrapper[29252]: E1203 20:17:06.088988 29252 log.go:32] "ExecSync cmd 
from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1d7cd4ec51aa6a09535a7654afad8e7721a38290da904e6611ba943292b9d2ad is running failed: container process not found" containerID="1d7cd4ec51aa6a09535a7654afad8e7721a38290da904e6611ba943292b9d2ad" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"]
Dec 03 20:17:06.089161 master-0 kubenswrapper[29252]: E1203 20:17:06.089044 29252 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1d7cd4ec51aa6a09535a7654afad8e7721a38290da904e6611ba943292b9d2ad is running failed: container process not found" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="prometheus"
Dec 03 20:17:06.244881 master-0 kubenswrapper[29252]: I1203 20:17:06.244524 29252 generic.go:334] "Generic (PLEG): container finished" podID="f3560529-2f6a-4193-b606-18474b120488" containerID="da1fa4988d2b94cb2e703112b63c61fa999bce00d13cedeb5375da4e716443a5" exitCode=0
Dec 03 20:17:06.244881 master-0 kubenswrapper[29252]: I1203 20:17:06.244560 29252 generic.go:334] "Generic (PLEG): container finished" podID="f3560529-2f6a-4193-b606-18474b120488" containerID="0bd8d740d658995d8dd28eae580836ac58f46ce3526c9a1e22c7f53333d01c60" exitCode=0
Dec 03 20:17:06.244881 master-0 kubenswrapper[29252]: I1203 20:17:06.244569 29252 generic.go:334] "Generic (PLEG): container finished" podID="f3560529-2f6a-4193-b606-18474b120488" containerID="f181c4fccef6b5098a5cf8c94ad1bdd8de61d3ad6b762757fe9a7bdfb22c63a5" exitCode=0
Dec 03 20:17:06.244881 master-0 kubenswrapper[29252]: I1203 20:17:06.244576 29252 generic.go:334] "Generic (PLEG): container finished" podID="f3560529-2f6a-4193-b606-18474b120488" containerID="e1de45814aaa6ff018c658de9e1964f6f62f6d7e90feb94cdb56eb7462f11358" exitCode=0
Dec 03 20:17:06.244881 master-0 kubenswrapper[29252]: I1203 20:17:06.244584 29252 generic.go:334] "Generic (PLEG): container finished" podID="f3560529-2f6a-4193-b606-18474b120488" containerID="3ad78927878b51cd7f4848ccccfad56ebf4205f807aa88db1abae4a630d88721" exitCode=0
Dec 03 20:17:06.244881 master-0 kubenswrapper[29252]: I1203 20:17:06.244593 29252 generic.go:334] "Generic (PLEG): container finished" podID="f3560529-2f6a-4193-b606-18474b120488" containerID="1d7cd4ec51aa6a09535a7654afad8e7721a38290da904e6611ba943292b9d2ad" exitCode=0
Dec 03 20:17:06.244881 master-0 kubenswrapper[29252]: I1203 20:17:06.244623 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3560529-2f6a-4193-b606-18474b120488","Type":"ContainerDied","Data":"da1fa4988d2b94cb2e703112b63c61fa999bce00d13cedeb5375da4e716443a5"}
Dec 03 20:17:06.244881 master-0 kubenswrapper[29252]: I1203 20:17:06.244690 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3560529-2f6a-4193-b606-18474b120488","Type":"ContainerDied","Data":"0bd8d740d658995d8dd28eae580836ac58f46ce3526c9a1e22c7f53333d01c60"}
Dec 03 20:17:06.244881 master-0 kubenswrapper[29252]: I1203 20:17:06.244714 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3560529-2f6a-4193-b606-18474b120488","Type":"ContainerDied","Data":"f181c4fccef6b5098a5cf8c94ad1bdd8de61d3ad6b762757fe9a7bdfb22c63a5"}
Dec 03 20:17:06.244881 master-0 kubenswrapper[29252]: I1203 20:17:06.244732 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3560529-2f6a-4193-b606-18474b120488","Type":"ContainerDied","Data":"e1de45814aaa6ff018c658de9e1964f6f62f6d7e90feb94cdb56eb7462f11358"}
Dec 03 20:17:06.244881 master-0 kubenswrapper[29252]: I1203 20:17:06.244750 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3560529-2f6a-4193-b606-18474b120488","Type":"ContainerDied","Data":"3ad78927878b51cd7f4848ccccfad56ebf4205f807aa88db1abae4a630d88721"}
Dec 03 20:17:06.244881 master-0 kubenswrapper[29252]: I1203 20:17:06.244766 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3560529-2f6a-4193-b606-18474b120488","Type":"ContainerDied","Data":"1d7cd4ec51aa6a09535a7654afad8e7721a38290da904e6611ba943292b9d2ad"}
Dec 03 20:17:06.247658 master-0 kubenswrapper[29252]: I1203 20:17:06.247603 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65c74dc56f-mlqjw_e13ed7cc-6322-4676-88fc-363cff00f509/console/0.log"
Dec 03 20:17:06.247798 master-0 kubenswrapper[29252]: I1203 20:17:06.247682 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65c74dc56f-mlqjw" event={"ID":"e13ed7cc-6322-4676-88fc-363cff00f509","Type":"ContainerDied","Data":"b64169813e76f27d5b81af1f242ceb058bc9e3bf5600cb7456188abdcedde17d"}
Dec 03 20:17:06.247798 master-0 kubenswrapper[29252]: I1203 20:17:06.247745 29252 scope.go:117] "RemoveContainer" containerID="f54a06368dd236747f03ccfb28200dab1a76dccafc53a58a78b24d448650bca8"
Dec 03 20:17:06.248985 master-0 kubenswrapper[29252]: I1203 20:17:06.248767 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65c74dc56f-mlqjw"
Dec 03 20:17:06.323454 master-0 kubenswrapper[29252]: I1203 20:17:06.323416 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Dec 03 20:17:06.331052 master-0 kubenswrapper[29252]: I1203 20:17:06.330998 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65c74dc56f-mlqjw"]
Dec 03 20:17:06.345111 master-0 kubenswrapper[29252]: I1203 20:17:06.345041 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-65c74dc56f-mlqjw"]
Dec 03 20:17:06.445053 master-0 kubenswrapper[29252]: I1203 20:17:06.445010 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-prometheus-k8s-rulefiles-0\") pod \"f3560529-2f6a-4193-b606-18474b120488\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") "
Dec 03 20:17:06.445053 master-0 kubenswrapper[29252]: I1203 20:17:06.445052 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-configmap-metrics-client-ca\") pod \"f3560529-2f6a-4193-b606-18474b120488\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") "
Dec 03 20:17:06.445293 master-0 kubenswrapper[29252]: I1203 20:17:06.445073 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79xnq\" (UniqueName: \"kubernetes.io/projected/f3560529-2f6a-4193-b606-18474b120488-kube-api-access-79xnq\") pod \"f3560529-2f6a-4193-b606-18474b120488\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") "
Dec 03 20:17:06.445293 master-0 kubenswrapper[29252]: I1203 20:17:06.445129 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"f3560529-2f6a-4193-b606-18474b120488\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") "
Dec 03 20:17:06.445293 master-0 kubenswrapper[29252]: I1203 20:17:06.445149 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-metrics-client-certs\") pod \"f3560529-2f6a-4193-b606-18474b120488\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") "
Dec 03 20:17:06.445293 master-0 kubenswrapper[29252]: I1203 20:17:06.445176 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3560529-2f6a-4193-b606-18474b120488-config-out\") pod \"f3560529-2f6a-4193-b606-18474b120488\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") "
Dec 03 20:17:06.445293 master-0 kubenswrapper[29252]: I1203 20:17:06.445194 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"f3560529-2f6a-4193-b606-18474b120488\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") "
Dec 03 20:17:06.445293 master-0 kubenswrapper[29252]: I1203 20:17:06.445212 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-prometheus-trusted-ca-bundle\") pod \"f3560529-2f6a-4193-b606-18474b120488\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") "
Dec 03 20:17:06.445293 master-0 kubenswrapper[29252]: I1203 20:17:06.445248 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-configmap-kubelet-serving-ca-bundle\") pod \"f3560529-2f6a-4193-b606-18474b120488\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") "
Dec 03 20:17:06.445293 master-0 kubenswrapper[29252]: I1203 20:17:06.445266 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-config\") pod \"f3560529-2f6a-4193-b606-18474b120488\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") "
Dec 03 20:17:06.445293 master-0 kubenswrapper[29252]: I1203 20:17:06.445281 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-kube-rbac-proxy\") pod \"f3560529-2f6a-4193-b606-18474b120488\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") "
Dec 03 20:17:06.445293 master-0 kubenswrapper[29252]: I1203 20:17:06.445297 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-grpc-tls\") pod \"f3560529-2f6a-4193-b606-18474b120488\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") "
Dec 03 20:17:06.445874 master-0 kubenswrapper[29252]: I1203 20:17:06.445314 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3560529-2f6a-4193-b606-18474b120488-tls-assets\") pod \"f3560529-2f6a-4193-b606-18474b120488\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") "
Dec 03 20:17:06.445874 master-0 kubenswrapper[29252]: I1203 20:17:06.445338 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-configmap-serving-certs-ca-bundle\") pod \"f3560529-2f6a-4193-b606-18474b120488\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") "
Dec 03 20:17:06.445874 master-0 kubenswrapper[29252]: I1203 20:17:06.445372 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-web-config\") pod \"f3560529-2f6a-4193-b606-18474b120488\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") "
Dec 03 20:17:06.445874 master-0 kubenswrapper[29252]: I1203 20:17:06.445392 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f3560529-2f6a-4193-b606-18474b120488-prometheus-k8s-db\") pod \"f3560529-2f6a-4193-b606-18474b120488\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") "
Dec 03 20:17:06.445874 master-0 kubenswrapper[29252]: I1203 20:17:06.445418 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-thanos-prometheus-http-client-file\") pod \"f3560529-2f6a-4193-b606-18474b120488\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") "
Dec 03 20:17:06.445874 master-0 kubenswrapper[29252]: I1203 20:17:06.445439 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-prometheus-k8s-tls\") pod \"f3560529-2f6a-4193-b606-18474b120488\" (UID: \"f3560529-2f6a-4193-b606-18474b120488\") "
Dec 03 20:17:06.446477 master-0 kubenswrapper[29252]: I1203 20:17:06.446408 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "f3560529-2f6a-4193-b606-18474b120488" (UID: "f3560529-2f6a-4193-b606-18474b120488"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 20:17:06.447267 master-0 kubenswrapper[29252]: I1203 20:17:06.447189 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "f3560529-2f6a-4193-b606-18474b120488" (UID: "f3560529-2f6a-4193-b606-18474b120488"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 20:17:06.447515 master-0 kubenswrapper[29252]: I1203 20:17:06.447466 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "f3560529-2f6a-4193-b606-18474b120488" (UID: "f3560529-2f6a-4193-b606-18474b120488"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 20:17:06.447576 master-0 kubenswrapper[29252]: I1203 20:17:06.447503 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "f3560529-2f6a-4193-b606-18474b120488" (UID: "f3560529-2f6a-4193-b606-18474b120488"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 20:17:06.449067 master-0 kubenswrapper[29252]: I1203 20:17:06.449036 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "f3560529-2f6a-4193-b606-18474b120488" (UID: "f3560529-2f6a-4193-b606-18474b120488"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 20:17:06.449253 master-0 kubenswrapper[29252]: I1203 20:17:06.449216 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3560529-2f6a-4193-b606-18474b120488-kube-api-access-79xnq" (OuterVolumeSpecName: "kube-api-access-79xnq") pod "f3560529-2f6a-4193-b606-18474b120488" (UID: "f3560529-2f6a-4193-b606-18474b120488"). InnerVolumeSpecName "kube-api-access-79xnq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 20:17:06.449524 master-0 kubenswrapper[29252]: I1203 20:17:06.449494 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "f3560529-2f6a-4193-b606-18474b120488" (UID: "f3560529-2f6a-4193-b606-18474b120488"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 20:17:06.449890 master-0 kubenswrapper[29252]: I1203 20:17:06.449850 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3560529-2f6a-4193-b606-18474b120488-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "f3560529-2f6a-4193-b606-18474b120488" (UID: "f3560529-2f6a-4193-b606-18474b120488"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 20:17:06.449964 master-0 kubenswrapper[29252]: I1203 20:17:06.449929 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "f3560529-2f6a-4193-b606-18474b120488" (UID: "f3560529-2f6a-4193-b606-18474b120488"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 20:17:06.450022 master-0 kubenswrapper[29252]: I1203 20:17:06.449952 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3560529-2f6a-4193-b606-18474b120488-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f3560529-2f6a-4193-b606-18474b120488" (UID: "f3560529-2f6a-4193-b606-18474b120488"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 20:17:06.450255 master-0 kubenswrapper[29252]: I1203 20:17:06.450221 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3560529-2f6a-4193-b606-18474b120488-config-out" (OuterVolumeSpecName: "config-out") pod "f3560529-2f6a-4193-b606-18474b120488" (UID: "f3560529-2f6a-4193-b606-18474b120488"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 20:17:06.450594 master-0 kubenswrapper[29252]: I1203 20:17:06.450551 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-config" (OuterVolumeSpecName: "config") pod "f3560529-2f6a-4193-b606-18474b120488" (UID: "f3560529-2f6a-4193-b606-18474b120488"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 20:17:06.451343 master-0 kubenswrapper[29252]: I1203 20:17:06.451305 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f3560529-2f6a-4193-b606-18474b120488" (UID: "f3560529-2f6a-4193-b606-18474b120488"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 20:17:06.451769 master-0 kubenswrapper[29252]: I1203 20:17:06.451731 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "f3560529-2f6a-4193-b606-18474b120488" (UID: "f3560529-2f6a-4193-b606-18474b120488"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 20:17:06.451862 master-0 kubenswrapper[29252]: I1203 20:17:06.451843 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "f3560529-2f6a-4193-b606-18474b120488" (UID: "f3560529-2f6a-4193-b606-18474b120488"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 20:17:06.452038 master-0 kubenswrapper[29252]: I1203 20:17:06.452004 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "f3560529-2f6a-4193-b606-18474b120488" (UID: "f3560529-2f6a-4193-b606-18474b120488"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 20:17:06.453501 master-0 kubenswrapper[29252]: I1203 20:17:06.453450 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "f3560529-2f6a-4193-b606-18474b120488" (UID: "f3560529-2f6a-4193-b606-18474b120488"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 20:17:06.489308 master-0 kubenswrapper[29252]: I1203 20:17:06.489257 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-web-config" (OuterVolumeSpecName: "web-config") pod "f3560529-2f6a-4193-b606-18474b120488" (UID: "f3560529-2f6a-4193-b606-18474b120488"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 20:17:06.546959 master-0 kubenswrapper[29252]: I1203 20:17:06.546891 29252 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-prometheus-k8s-rulefiles-0\") on node \"master-0\" DevicePath \"\""
Dec 03 20:17:06.546959 master-0 kubenswrapper[29252]: I1203 20:17:06.546931 29252 reconciler_common.go:293] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-configmap-metrics-client-ca\") on node \"master-0\" DevicePath \"\""
Dec 03 20:17:06.546959 master-0 kubenswrapper[29252]: I1203 20:17:06.546942 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79xnq\" (UniqueName: \"kubernetes.io/projected/f3560529-2f6a-4193-b606-18474b120488-kube-api-access-79xnq\") on node \"master-0\" DevicePath \"\""
Dec 03 20:17:06.546959 master-0 kubenswrapper[29252]: I1203 20:17:06.546954 29252 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"master-0\" DevicePath \"\""
Dec 03 20:17:06.546959 master-0 kubenswrapper[29252]: I1203 20:17:06.546965 29252 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\""
Dec 03 20:17:06.546959 master-0 kubenswrapper[29252]: I1203 20:17:06.546975 29252 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3560529-2f6a-4193-b606-18474b120488-config-out\") on node \"master-0\" DevicePath \"\""
Dec 03 20:17:06.547320 master-0 kubenswrapper[29252]: I1203 20:17:06.546986 29252 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\""
Dec 03 20:17:06.547320 master-0 kubenswrapper[29252]: I1203 20:17:06.546996 29252 reconciler_common.go:293] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-prometheus-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Dec 03 20:17:06.547320 master-0 kubenswrapper[29252]: I1203 20:17:06.547005 29252 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\""
Dec 03 20:17:06.547320 master-0 kubenswrapper[29252]: I1203 20:17:06.547014 29252 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-config\") on node \"master-0\" DevicePath \"\""
Dec 03 20:17:06.547320 master-0 kubenswrapper[29252]: I1203 20:17:06.547022 29252 reconciler_common.go:293] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\""
Dec 03 20:17:06.547320 master-0 kubenswrapper[29252]: I1203 20:17:06.547031 29252 reconciler_common.go:293] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-grpc-tls\") on node \"master-0\" DevicePath \"\""
Dec 03 20:17:06.547320 master-0 kubenswrapper[29252]: I1203 20:17:06.547039 29252 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3560529-2f6a-4193-b606-18474b120488-tls-assets\") on node \"master-0\" DevicePath \"\""
Dec 03 20:17:06.547320 master-0 kubenswrapper[29252]: I1203 20:17:06.547048 29252 reconciler_common.go:293] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3560529-2f6a-4193-b606-18474b120488-configmap-serving-certs-ca-bundle\") on node \"master-0\" DevicePath \"\""
Dec 03 20:17:06.547320 master-0 kubenswrapper[29252]: I1203 20:17:06.547056 29252 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-web-config\") on node \"master-0\" DevicePath \"\""
Dec 03 20:17:06.547320 master-0 kubenswrapper[29252]: I1203 20:17:06.547065 29252 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f3560529-2f6a-4193-b606-18474b120488-prometheus-k8s-db\") on node \"master-0\" DevicePath \"\""
Dec 03 20:17:06.547320 master-0 kubenswrapper[29252]: I1203 20:17:06.547073 29252 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-thanos-prometheus-http-client-file\") on node \"master-0\" DevicePath \"\""
Dec 03 20:17:06.547320 master-0 kubenswrapper[29252]: I1203 20:17:06.547082 29252 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f3560529-2f6a-4193-b606-18474b120488-secret-prometheus-k8s-tls\") on node \"master-0\" DevicePath \"\""
Dec 03 20:17:07.264204 master-0 kubenswrapper[29252]: I1203 20:17:07.264130 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3560529-2f6a-4193-b606-18474b120488","Type":"ContainerDied","Data":"d51105b3dc08f7680d5b29f7a5ad44e8c57e73d108687231c5eb70618423681a"}
Dec 03 20:17:07.264204 master-0 kubenswrapper[29252]: I1203 20:17:07.264215 29252 scope.go:117] "RemoveContainer" containerID="da1fa4988d2b94cb2e703112b63c61fa999bce00d13cedeb5375da4e716443a5"
Dec 03 20:17:07.265214 master-0 kubenswrapper[29252]: I1203 20:17:07.264241 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Dec 03 20:17:07.287346 master-0 kubenswrapper[29252]: I1203 20:17:07.287311 29252 scope.go:117] "RemoveContainer" containerID="0bd8d740d658995d8dd28eae580836ac58f46ce3526c9a1e22c7f53333d01c60"
Dec 03 20:17:07.307041 master-0 kubenswrapper[29252]: I1203 20:17:07.307000 29252 scope.go:117] "RemoveContainer" containerID="f181c4fccef6b5098a5cf8c94ad1bdd8de61d3ad6b762757fe9a7bdfb22c63a5"
Dec 03 20:17:07.316218 master-0 kubenswrapper[29252]: I1203 20:17:07.316163 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Dec 03 20:17:07.324737 master-0 kubenswrapper[29252]: I1203 20:17:07.324663 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Dec 03 20:17:07.326939 master-0 kubenswrapper[29252]: I1203 20:17:07.326903 29252 scope.go:117] "RemoveContainer" containerID="e1de45814aaa6ff018c658de9e1964f6f62f6d7e90feb94cdb56eb7462f11358"
Dec 03 20:17:07.341497 master-0 kubenswrapper[29252]: I1203 20:17:07.341444 29252 scope.go:117] "RemoveContainer" containerID="3ad78927878b51cd7f4848ccccfad56ebf4205f807aa88db1abae4a630d88721"
Dec 03 20:17:07.365089 master-0 kubenswrapper[29252]: I1203 20:17:07.365030 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Dec 03 20:17:07.365326 master-0 kubenswrapper[29252]: E1203 20:17:07.365315 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="kube-rbac-proxy-web"
Dec 03 20:17:07.365374 master-0 kubenswrapper[29252]: I1203 20:17:07.365329 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="kube-rbac-proxy-web"
Dec 03 20:17:07.365374 master-0 kubenswrapper[29252]: E1203 20:17:07.365347 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="kube-rbac-proxy"
Dec 03 20:17:07.365374 master-0 kubenswrapper[29252]: I1203 20:17:07.365353 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="kube-rbac-proxy"
Dec 03 20:17:07.365374 master-0 kubenswrapper[29252]: E1203 20:17:07.365368 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="config-reloader"
Dec 03 20:17:07.365374 master-0 kubenswrapper[29252]: I1203 20:17:07.365374 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="config-reloader"
Dec 03 20:17:07.365648 master-0 kubenswrapper[29252]: E1203 20:17:07.365385 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="thanos-sidecar"
Dec 03 20:17:07.365648 master-0 kubenswrapper[29252]: I1203 20:17:07.365392 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="thanos-sidecar"
Dec 03 20:17:07.365648 master-0 kubenswrapper[29252]: E1203 20:17:07.365402 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="kube-rbac-proxy-thanos"
Dec 03 20:17:07.365648 master-0 kubenswrapper[29252]: I1203 20:17:07.365408 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="kube-rbac-proxy-thanos"
Dec 03 20:17:07.365648 master-0 kubenswrapper[29252]: E1203 20:17:07.365419 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e13ed7cc-6322-4676-88fc-363cff00f509" containerName="console"
Dec 03 20:17:07.365648 master-0 kubenswrapper[29252]: I1203 20:17:07.365424 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="e13ed7cc-6322-4676-88fc-363cff00f509" containerName="console"
Dec 03 20:17:07.365648 master-0 kubenswrapper[29252]: E1203 20:17:07.365433 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="prometheus"
Dec 03 20:17:07.365648 master-0 kubenswrapper[29252]: I1203 20:17:07.365438 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="prometheus"
Dec 03 20:17:07.365648 master-0 kubenswrapper[29252]: E1203 20:17:07.365446 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="init-config-reloader"
Dec 03 20:17:07.365648 master-0 kubenswrapper[29252]: I1203 20:17:07.365452 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="init-config-reloader"
Dec 03 20:17:07.365648 master-0 kubenswrapper[29252]: I1203 20:17:07.365556 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="kube-rbac-proxy-web"
Dec 03 20:17:07.365648 master-0 kubenswrapper[29252]: I1203 20:17:07.365563 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="kube-rbac-proxy-thanos"
Dec 03 20:17:07.365648 master-0 kubenswrapper[29252]: I1203 20:17:07.365579 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="config-reloader"
Dec 03 20:17:07.365648 master-0 kubenswrapper[29252]: I1203 20:17:07.365591 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="prometheus"
Dec 03 20:17:07.365648 master-0 kubenswrapper[29252]: I1203 20:17:07.365610 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="thanos-sidecar"
Dec 03 20:17:07.365648 master-0 kubenswrapper[29252]: I1203 20:17:07.365616 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3560529-2f6a-4193-b606-18474b120488" containerName="kube-rbac-proxy"
Dec 03 20:17:07.365648 master-0 kubenswrapper[29252]: I1203 20:17:07.365629 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="e13ed7cc-6322-4676-88fc-363cff00f509" containerName="console"
Dec 03 20:17:07.367764 master-0 kubenswrapper[29252]: I1203 20:17:07.367725 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Dec 03 20:17:07.372854 master-0 kubenswrapper[29252]: I1203 20:17:07.372808 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web"
Dec 03 20:17:07.373211 master-0 kubenswrapper[29252]: I1203 20:17:07.372851 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Dec 03 20:17:07.374593 master-0 kubenswrapper[29252]: I1203 20:17:07.374553 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-4vlhrftdrf07t"
Dec 03 20:17:07.375366 master-0 kubenswrapper[29252]: I1203 20:17:07.375313 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Dec 03 20:17:07.375421 master-0 kubenswrapper[29252]: I1203 20:17:07.375346 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle"
Dec 03 20:17:07.375713 master-0 kubenswrapper[29252]: I1203 20:17:07.375672 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls"
Dec 03 20:17:07.375915 master-0 kubenswrapper[29252]: I1203 20:17:07.375731 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s"
Dec 03 20:17:07.375915 master-0 kubenswrapper[29252]: I1203 20:17:07.375417 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Dec 03 20:17:07.376983 master-0 kubenswrapper[29252]: I1203 20:17:07.375432 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Dec 03 20:17:07.377109 master-0 kubenswrapper[29252]: I1203 20:17:07.376041 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-s7kpg"
Dec 03 20:17:07.377177 master-0 kubenswrapper[29252]: I1203 20:17:07.376197 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Dec 03 20:17:07.383990 master-0 kubenswrapper[29252]: I1203 20:17:07.380750 29252 scope.go:117] "RemoveContainer" containerID="1d7cd4ec51aa6a09535a7654afad8e7721a38290da904e6611ba943292b9d2ad"
Dec 03 20:17:07.385041 master-0 kubenswrapper[29252]: I1203 20:17:07.385008 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Dec 03 20:17:07.390019 master-0 kubenswrapper[29252]: I1203 20:17:07.389972 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Dec 03 20:17:07.407590 master-0 kubenswrapper[29252]: I1203 20:17:07.407252 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Dec 03 20:17:07.456061 master-0 kubenswrapper[29252]: I1203 20:17:07.456014 29252 scope.go:117] "RemoveContainer" containerID="78c0709728fd6de20a4742e9ccb3c8eafe7bbbc9e3f8cb53aeaf2205b70b7762"
Dec 03 20:17:07.456496 master-0 kubenswrapper[29252]: I1203 20:17:07.456346 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e13ed7cc-6322-4676-88fc-363cff00f509" path="/var/lib/kubelet/pods/e13ed7cc-6322-4676-88fc-363cff00f509/volumes"
Dec 03 20:17:07.457260 master-0 kubenswrapper[29252]: I1203 20:17:07.457215 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3560529-2f6a-4193-b606-18474b120488" path="/var/lib/kubelet/pods/f3560529-2f6a-4193-b606-18474b120488/volumes"
Dec 03 20:17:07.563464 master-0 kubenswrapper[29252]: I1203 20:17:07.563349 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e8c938b9-2779-4966-bdcb-3dfad66828e2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 03 20:17:07.563464 master-0 kubenswrapper[29252]: I1203 20:17:07.563396 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 03 20:17:07.563464 master-0 kubenswrapper[29252]: I1203 20:17:07.563418 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddbrv\" (UniqueName: \"kubernetes.io/projected/e8c938b9-2779-4966-bdcb-3dfad66828e2-kube-api-access-ddbrv\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 03 20:17:07.564136 master-0 kubenswrapper[29252]: I1203 20:17:07.564088 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-config\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 03 20:17:07.566465 master-0 kubenswrapper[29252]: I1203 20:17:07.566437 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8c938b9-2779-4966-bdcb-3dfad66828e2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 03 20:17:07.566548 master-0 kubenswrapper[29252]: I1203 20:17:07.566479 29252 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e8c938b9-2779-4966-bdcb-3dfad66828e2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.566548 master-0 kubenswrapper[29252]: I1203 20:17:07.566504 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.566548 master-0 kubenswrapper[29252]: I1203 20:17:07.566538 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e8c938b9-2779-4966-bdcb-3dfad66828e2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.566688 master-0 kubenswrapper[29252]: I1203 20:17:07.566563 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.566688 master-0 kubenswrapper[29252]: I1203 20:17:07.566589 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8c938b9-2779-4966-bdcb-3dfad66828e2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.566688 master-0 kubenswrapper[29252]: I1203 20:17:07.566628 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.566829 master-0 kubenswrapper[29252]: I1203 20:17:07.566694 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e8c938b9-2779-4966-bdcb-3dfad66828e2-config-out\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.566829 master-0 kubenswrapper[29252]: I1203 20:17:07.566734 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8c938b9-2779-4966-bdcb-3dfad66828e2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.566829 master-0 kubenswrapper[29252]: I1203 20:17:07.566757 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-web-config\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.566829 master-0 kubenswrapper[29252]: I1203 20:17:07.566794 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/e8c938b9-2779-4966-bdcb-3dfad66828e2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.567004 master-0 kubenswrapper[29252]: I1203 20:17:07.566841 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.567004 master-0 kubenswrapper[29252]: I1203 20:17:07.566873 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.567093 master-0 kubenswrapper[29252]: I1203 20:17:07.567034 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.669366 master-0 kubenswrapper[29252]: I1203 20:17:07.669250 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8c938b9-2779-4966-bdcb-3dfad66828e2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.669718 master-0 kubenswrapper[29252]: 
I1203 20:17:07.669338 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-web-config\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.670078 master-0 kubenswrapper[29252]: I1203 20:17:07.670025 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e8c938b9-2779-4966-bdcb-3dfad66828e2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.670410 master-0 kubenswrapper[29252]: I1203 20:17:07.670317 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e8c938b9-2779-4966-bdcb-3dfad66828e2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.670835 master-0 kubenswrapper[29252]: I1203 20:17:07.670759 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.670943 master-0 kubenswrapper[29252]: I1203 20:17:07.670919 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.671716 master-0 kubenswrapper[29252]: I1203 
20:17:07.671629 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8c938b9-2779-4966-bdcb-3dfad66828e2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.671716 master-0 kubenswrapper[29252]: I1203 20:17:07.671647 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.672030 master-0 kubenswrapper[29252]: I1203 20:17:07.671917 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e8c938b9-2779-4966-bdcb-3dfad66828e2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.672030 master-0 kubenswrapper[29252]: I1203 20:17:07.671966 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.672030 master-0 kubenswrapper[29252]: I1203 20:17:07.672013 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddbrv\" (UniqueName: \"kubernetes.io/projected/e8c938b9-2779-4966-bdcb-3dfad66828e2-kube-api-access-ddbrv\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" 
Dec 03 20:17:07.672262 master-0 kubenswrapper[29252]: I1203 20:17:07.672057 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-config\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.672262 master-0 kubenswrapper[29252]: I1203 20:17:07.672112 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8c938b9-2779-4966-bdcb-3dfad66828e2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.672262 master-0 kubenswrapper[29252]: I1203 20:17:07.672150 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e8c938b9-2779-4966-bdcb-3dfad66828e2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.672262 master-0 kubenswrapper[29252]: I1203 20:17:07.672188 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.672262 master-0 kubenswrapper[29252]: I1203 20:17:07.672221 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e8c938b9-2779-4966-bdcb-3dfad66828e2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " 
pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.672596 master-0 kubenswrapper[29252]: I1203 20:17:07.672267 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.672596 master-0 kubenswrapper[29252]: I1203 20:17:07.672327 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8c938b9-2779-4966-bdcb-3dfad66828e2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.672596 master-0 kubenswrapper[29252]: I1203 20:17:07.672428 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.672596 master-0 kubenswrapper[29252]: I1203 20:17:07.672478 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e8c938b9-2779-4966-bdcb-3dfad66828e2-config-out\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.673290 master-0 kubenswrapper[29252]: I1203 20:17:07.673245 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8c938b9-2779-4966-bdcb-3dfad66828e2-configmap-serving-certs-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.674830 master-0 kubenswrapper[29252]: I1203 20:17:07.674741 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e8c938b9-2779-4966-bdcb-3dfad66828e2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.675049 master-0 kubenswrapper[29252]: I1203 20:17:07.675005 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8c938b9-2779-4966-bdcb-3dfad66828e2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.677823 master-0 kubenswrapper[29252]: I1203 20:17:07.677741 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.678386 master-0 kubenswrapper[29252]: I1203 20:17:07.678343 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.678551 master-0 kubenswrapper[29252]: I1203 20:17:07.678496 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-web-config\") pod 
\"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.679013 master-0 kubenswrapper[29252]: I1203 20:17:07.678929 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e8c938b9-2779-4966-bdcb-3dfad66828e2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.679155 master-0 kubenswrapper[29252]: I1203 20:17:07.679070 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.679588 master-0 kubenswrapper[29252]: I1203 20:17:07.679515 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.679724 master-0 kubenswrapper[29252]: I1203 20:17:07.679684 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.681704 master-0 kubenswrapper[29252]: I1203 20:17:07.680406 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-config\") pod 
\"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.681704 master-0 kubenswrapper[29252]: I1203 20:17:07.681604 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.683116 master-0 kubenswrapper[29252]: I1203 20:17:07.683080 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e8c938b9-2779-4966-bdcb-3dfad66828e2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.684108 master-0 kubenswrapper[29252]: I1203 20:17:07.684062 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e8c938b9-2779-4966-bdcb-3dfad66828e2-config-out\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.687687 master-0 kubenswrapper[29252]: I1203 20:17:07.687626 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e8c938b9-2779-4966-bdcb-3dfad66828e2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.706197 master-0 kubenswrapper[29252]: I1203 20:17:07.706128 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddbrv\" (UniqueName: \"kubernetes.io/projected/e8c938b9-2779-4966-bdcb-3dfad66828e2-kube-api-access-ddbrv\") pod 
\"prometheus-k8s-0\" (UID: \"e8c938b9-2779-4966-bdcb-3dfad66828e2\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:07.733886 master-0 kubenswrapper[29252]: I1203 20:17:07.733216 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:08.172309 master-0 kubenswrapper[29252]: I1203 20:17:08.172222 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 03 20:17:08.172707 master-0 kubenswrapper[29252]: W1203 20:17:08.172655 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8c938b9_2779_4966_bdcb_3dfad66828e2.slice/crio-3dc691fa43905aa30acf1c0f6b8aeff46646ac28315ef22a6a78175c3a5c5ba9 WatchSource:0}: Error finding container 3dc691fa43905aa30acf1c0f6b8aeff46646ac28315ef22a6a78175c3a5c5ba9: Status 404 returned error can't find the container with id 3dc691fa43905aa30acf1c0f6b8aeff46646ac28315ef22a6a78175c3a5c5ba9 Dec 03 20:17:08.273484 master-0 kubenswrapper[29252]: I1203 20:17:08.273427 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8c938b9-2779-4966-bdcb-3dfad66828e2","Type":"ContainerStarted","Data":"3dc691fa43905aa30acf1c0f6b8aeff46646ac28315ef22a6a78175c3a5c5ba9"} Dec 03 20:17:09.287431 master-0 kubenswrapper[29252]: I1203 20:17:09.286738 29252 generic.go:334] "Generic (PLEG): container finished" podID="e8c938b9-2779-4966-bdcb-3dfad66828e2" containerID="5564f8073665daaa255671f61fdeef636446b25742b321c159c5aceed43bfb79" exitCode=0 Dec 03 20:17:09.287431 master-0 kubenswrapper[29252]: I1203 20:17:09.286837 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8c938b9-2779-4966-bdcb-3dfad66828e2","Type":"ContainerDied","Data":"5564f8073665daaa255671f61fdeef636446b25742b321c159c5aceed43bfb79"} Dec 03 20:17:10.296125 master-0 
kubenswrapper[29252]: I1203 20:17:10.296034 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8c938b9-2779-4966-bdcb-3dfad66828e2","Type":"ContainerStarted","Data":"be07a49548a6f1132780ffbc47b123da2e39a53084936ab5821c182a672652d3"} Dec 03 20:17:10.296125 master-0 kubenswrapper[29252]: I1203 20:17:10.296087 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8c938b9-2779-4966-bdcb-3dfad66828e2","Type":"ContainerStarted","Data":"cf93ae07710079a8aa0a3c5a323da49ad779c9305de6afb27870c2e5473085ae"} Dec 03 20:17:10.296125 master-0 kubenswrapper[29252]: I1203 20:17:10.296100 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8c938b9-2779-4966-bdcb-3dfad66828e2","Type":"ContainerStarted","Data":"705fe4a9939ffbd3e1090271a585f9b683d2a0a94438010f6820503a14b62118"} Dec 03 20:17:10.296125 master-0 kubenswrapper[29252]: I1203 20:17:10.296114 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8c938b9-2779-4966-bdcb-3dfad66828e2","Type":"ContainerStarted","Data":"3d6721c6f4ddde4aa6bb1c615b82f22f501b2c656c2b30ec2b01f3fc6b54e37c"} Dec 03 20:17:10.296125 master-0 kubenswrapper[29252]: I1203 20:17:10.296127 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8c938b9-2779-4966-bdcb-3dfad66828e2","Type":"ContainerStarted","Data":"7a1f7f9b7d1092dfcfc1c2db672e64a2e571c9e9fd006c2181747248cf9debf9"} Dec 03 20:17:10.296125 master-0 kubenswrapper[29252]: I1203 20:17:10.296140 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8c938b9-2779-4966-bdcb-3dfad66828e2","Type":"ContainerStarted","Data":"b195e9342918b00f9d7ff9322b8a458cc070fbbcf0019d5a6aba3548ed7f0b45"} Dec 03 20:17:10.333898 master-0 kubenswrapper[29252]: I1203 
20:17:10.333707 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.33367946 podStartE2EDuration="3.33367946s" podCreationTimestamp="2025-12-03 20:17:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:17:10.330837571 +0000 UTC m=+465.144382534" watchObservedRunningTime="2025-12-03 20:17:10.33367946 +0000 UTC m=+465.147224453" Dec 03 20:17:12.734144 master-0 kubenswrapper[29252]: I1203 20:17:12.734084 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 20:17:16.234857 master-0 kubenswrapper[29252]: I1203 20:17:16.234745 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6465b775c-7mmtn" podUID="3d2903de-a51a-415a-80be-9ba79b4e173d" containerName="console" containerID="cri-o://d030e7d53aba644d51860afb7ad0c94bcbff1374eeb55840348470bf3f43fa68" gracePeriod=15 Dec 03 20:17:16.660989 master-0 kubenswrapper[29252]: I1203 20:17:16.660932 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6465b775c-7mmtn_3d2903de-a51a-415a-80be-9ba79b4e173d/console/0.log" Dec 03 20:17:16.661192 master-0 kubenswrapper[29252]: I1203 20:17:16.661033 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:17:16.721807 master-0 kubenswrapper[29252]: I1203 20:17:16.721732 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d2903de-a51a-415a-80be-9ba79b4e173d-console-serving-cert\") pod \"3d2903de-a51a-415a-80be-9ba79b4e173d\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " Dec 03 20:17:16.721994 master-0 kubenswrapper[29252]: I1203 20:17:16.721828 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lsj6\" (UniqueName: \"kubernetes.io/projected/3d2903de-a51a-415a-80be-9ba79b4e173d-kube-api-access-2lsj6\") pod \"3d2903de-a51a-415a-80be-9ba79b4e173d\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " Dec 03 20:17:16.721994 master-0 kubenswrapper[29252]: I1203 20:17:16.721889 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d2903de-a51a-415a-80be-9ba79b4e173d-console-oauth-config\") pod \"3d2903de-a51a-415a-80be-9ba79b4e173d\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " Dec 03 20:17:16.721994 master-0 kubenswrapper[29252]: I1203 20:17:16.721928 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-oauth-serving-cert\") pod \"3d2903de-a51a-415a-80be-9ba79b4e173d\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " Dec 03 20:17:16.722094 master-0 kubenswrapper[29252]: I1203 20:17:16.722009 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-trusted-ca-bundle\") pod \"3d2903de-a51a-415a-80be-9ba79b4e173d\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " Dec 03 20:17:16.722094 master-0 
kubenswrapper[29252]: I1203 20:17:16.722030 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-console-config\") pod \"3d2903de-a51a-415a-80be-9ba79b4e173d\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " Dec 03 20:17:16.722094 master-0 kubenswrapper[29252]: I1203 20:17:16.722071 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-service-ca\") pod \"3d2903de-a51a-415a-80be-9ba79b4e173d\" (UID: \"3d2903de-a51a-415a-80be-9ba79b4e173d\") " Dec 03 20:17:16.722666 master-0 kubenswrapper[29252]: I1203 20:17:16.722612 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3d2903de-a51a-415a-80be-9ba79b4e173d" (UID: "3d2903de-a51a-415a-80be-9ba79b4e173d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:17:16.722666 master-0 kubenswrapper[29252]: I1203 20:17:16.722653 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-console-config" (OuterVolumeSpecName: "console-config") pod "3d2903de-a51a-415a-80be-9ba79b4e173d" (UID: "3d2903de-a51a-415a-80be-9ba79b4e173d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:17:16.723162 master-0 kubenswrapper[29252]: I1203 20:17:16.723101 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3d2903de-a51a-415a-80be-9ba79b4e173d" (UID: "3d2903de-a51a-415a-80be-9ba79b4e173d"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:17:16.723471 master-0 kubenswrapper[29252]: I1203 20:17:16.723423 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-service-ca" (OuterVolumeSpecName: "service-ca") pod "3d2903de-a51a-415a-80be-9ba79b4e173d" (UID: "3d2903de-a51a-415a-80be-9ba79b4e173d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:17:16.724423 master-0 kubenswrapper[29252]: I1203 20:17:16.724382 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d2903de-a51a-415a-80be-9ba79b4e173d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3d2903de-a51a-415a-80be-9ba79b4e173d" (UID: "3d2903de-a51a-415a-80be-9ba79b4e173d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:17:16.724895 master-0 kubenswrapper[29252]: I1203 20:17:16.724857 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d2903de-a51a-415a-80be-9ba79b4e173d-kube-api-access-2lsj6" (OuterVolumeSpecName: "kube-api-access-2lsj6") pod "3d2903de-a51a-415a-80be-9ba79b4e173d" (UID: "3d2903de-a51a-415a-80be-9ba79b4e173d"). InnerVolumeSpecName "kube-api-access-2lsj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:17:16.725919 master-0 kubenswrapper[29252]: I1203 20:17:16.725754 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d2903de-a51a-415a-80be-9ba79b4e173d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3d2903de-a51a-415a-80be-9ba79b4e173d" (UID: "3d2903de-a51a-415a-80be-9ba79b4e173d"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:17:16.825387 master-0 kubenswrapper[29252]: I1203 20:17:16.825274 29252 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d2903de-a51a-415a-80be-9ba79b4e173d-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:16.825387 master-0 kubenswrapper[29252]: I1203 20:17:16.825379 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lsj6\" (UniqueName: \"kubernetes.io/projected/3d2903de-a51a-415a-80be-9ba79b4e173d-kube-api-access-2lsj6\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:16.825387 master-0 kubenswrapper[29252]: I1203 20:17:16.825399 29252 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d2903de-a51a-415a-80be-9ba79b4e173d-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:16.825667 master-0 kubenswrapper[29252]: I1203 20:17:16.825422 29252 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:16.825667 master-0 kubenswrapper[29252]: I1203 20:17:16.825449 29252 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:16.825667 master-0 kubenswrapper[29252]: I1203 20:17:16.825475 29252 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-console-config\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:16.825667 master-0 kubenswrapper[29252]: I1203 20:17:16.825498 29252 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/3d2903de-a51a-415a-80be-9ba79b4e173d-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:17.353151 master-0 kubenswrapper[29252]: I1203 20:17:17.353104 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6465b775c-7mmtn_3d2903de-a51a-415a-80be-9ba79b4e173d/console/0.log" Dec 03 20:17:17.354141 master-0 kubenswrapper[29252]: I1203 20:17:17.354062 29252 generic.go:334] "Generic (PLEG): container finished" podID="3d2903de-a51a-415a-80be-9ba79b4e173d" containerID="d030e7d53aba644d51860afb7ad0c94bcbff1374eeb55840348470bf3f43fa68" exitCode=2 Dec 03 20:17:17.354141 master-0 kubenswrapper[29252]: I1203 20:17:17.354117 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6465b775c-7mmtn" event={"ID":"3d2903de-a51a-415a-80be-9ba79b4e173d","Type":"ContainerDied","Data":"d030e7d53aba644d51860afb7ad0c94bcbff1374eeb55840348470bf3f43fa68"} Dec 03 20:17:17.354141 master-0 kubenswrapper[29252]: I1203 20:17:17.354139 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6465b775c-7mmtn" Dec 03 20:17:17.354483 master-0 kubenswrapper[29252]: I1203 20:17:17.354166 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6465b775c-7mmtn" event={"ID":"3d2903de-a51a-415a-80be-9ba79b4e173d","Type":"ContainerDied","Data":"4f93b8b809a825c50c7be9c2190dbcd0c45c4c033c75a39e6240c325280e1f11"} Dec 03 20:17:17.354483 master-0 kubenswrapper[29252]: I1203 20:17:17.354187 29252 scope.go:117] "RemoveContainer" containerID="d030e7d53aba644d51860afb7ad0c94bcbff1374eeb55840348470bf3f43fa68" Dec 03 20:17:17.370144 master-0 kubenswrapper[29252]: I1203 20:17:17.370104 29252 scope.go:117] "RemoveContainer" containerID="d030e7d53aba644d51860afb7ad0c94bcbff1374eeb55840348470bf3f43fa68" Dec 03 20:17:17.370493 master-0 kubenswrapper[29252]: E1203 20:17:17.370457 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d030e7d53aba644d51860afb7ad0c94bcbff1374eeb55840348470bf3f43fa68\": container with ID starting with d030e7d53aba644d51860afb7ad0c94bcbff1374eeb55840348470bf3f43fa68 not found: ID does not exist" containerID="d030e7d53aba644d51860afb7ad0c94bcbff1374eeb55840348470bf3f43fa68" Dec 03 20:17:17.370598 master-0 kubenswrapper[29252]: I1203 20:17:17.370494 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d030e7d53aba644d51860afb7ad0c94bcbff1374eeb55840348470bf3f43fa68"} err="failed to get container status \"d030e7d53aba644d51860afb7ad0c94bcbff1374eeb55840348470bf3f43fa68\": rpc error: code = NotFound desc = could not find container \"d030e7d53aba644d51860afb7ad0c94bcbff1374eeb55840348470bf3f43fa68\": container with ID starting with d030e7d53aba644d51860afb7ad0c94bcbff1374eeb55840348470bf3f43fa68 not found: ID does not exist" Dec 03 20:17:17.396478 master-0 kubenswrapper[29252]: I1203 20:17:17.396393 29252 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-console/console-6465b775c-7mmtn"] Dec 03 20:17:17.402833 master-0 kubenswrapper[29252]: I1203 20:17:17.402727 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6465b775c-7mmtn"] Dec 03 20:17:17.425395 master-0 kubenswrapper[29252]: I1203 20:17:17.425088 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d2903de-a51a-415a-80be-9ba79b4e173d" path="/var/lib/kubelet/pods/3d2903de-a51a-415a-80be-9ba79b4e173d/volumes" Dec 03 20:17:24.949597 master-0 kubenswrapper[29252]: I1203 20:17:24.949528 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 03 20:17:24.950364 master-0 kubenswrapper[29252]: I1203 20:17:24.949953 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="alertmanager" containerID="cri-o://0976a89f464ebb972c93a46088e9eeb54bd3bcf4771fafb2ab2a84f679391cfb" gracePeriod=120 Dec 03 20:17:24.950364 master-0 kubenswrapper[29252]: I1203 20:17:24.950021 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="kube-rbac-proxy" containerID="cri-o://34ce07503848cd6aad62ba91f3c407cfb5a322733a238fe0815d01f74c614873" gracePeriod=120 Dec 03 20:17:24.950364 master-0 kubenswrapper[29252]: I1203 20:17:24.950089 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="config-reloader" containerID="cri-o://41edc0c3867479b941ec31180dac8bca736f22ef5242e5d1acff2ee882afe88a" gracePeriod=120 Dec 03 20:17:24.950364 master-0 kubenswrapper[29252]: I1203 20:17:24.950049 29252 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="kube-rbac-proxy-web" containerID="cri-o://47a401e5c33c18bb1cfb970151b713d5420adf0b84eb4a88be63ba450bd5a61b" gracePeriod=120 Dec 03 20:17:24.950364 master-0 kubenswrapper[29252]: I1203 20:17:24.950091 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="kube-rbac-proxy-metric" containerID="cri-o://a47d8d5fc9fb8d8f5c161e8c2f4a0a8e14e1a13017007439234149dd1f6a68f4" gracePeriod=120 Dec 03 20:17:24.950364 master-0 kubenswrapper[29252]: I1203 20:17:24.950004 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="prom-label-proxy" containerID="cri-o://1451a79631eaf16c9eb478f51661577aa37eaea15ed18eb83a425743c7c87e7e" gracePeriod=120 Dec 03 20:17:25.434941 master-0 kubenswrapper[29252]: I1203 20:17:25.434459 29252 generic.go:334] "Generic (PLEG): container finished" podID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerID="1451a79631eaf16c9eb478f51661577aa37eaea15ed18eb83a425743c7c87e7e" exitCode=0 Dec 03 20:17:25.434941 master-0 kubenswrapper[29252]: I1203 20:17:25.434590 29252 generic.go:334] "Generic (PLEG): container finished" podID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerID="a47d8d5fc9fb8d8f5c161e8c2f4a0a8e14e1a13017007439234149dd1f6a68f4" exitCode=0 Dec 03 20:17:25.434941 master-0 kubenswrapper[29252]: I1203 20:17:25.434612 29252 generic.go:334] "Generic (PLEG): container finished" podID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerID="34ce07503848cd6aad62ba91f3c407cfb5a322733a238fe0815d01f74c614873" exitCode=0 Dec 03 20:17:25.434941 master-0 kubenswrapper[29252]: I1203 20:17:25.434624 29252 generic.go:334] "Generic (PLEG): container finished" podID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" 
containerID="47a401e5c33c18bb1cfb970151b713d5420adf0b84eb4a88be63ba450bd5a61b" exitCode=0 Dec 03 20:17:25.434941 master-0 kubenswrapper[29252]: I1203 20:17:25.434634 29252 generic.go:334] "Generic (PLEG): container finished" podID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerID="41edc0c3867479b941ec31180dac8bca736f22ef5242e5d1acff2ee882afe88a" exitCode=0 Dec 03 20:17:25.434941 master-0 kubenswrapper[29252]: I1203 20:17:25.434679 29252 generic.go:334] "Generic (PLEG): container finished" podID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerID="0976a89f464ebb972c93a46088e9eeb54bd3bcf4771fafb2ab2a84f679391cfb" exitCode=0 Dec 03 20:17:25.439554 master-0 kubenswrapper[29252]: I1203 20:17:25.439354 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d","Type":"ContainerDied","Data":"1451a79631eaf16c9eb478f51661577aa37eaea15ed18eb83a425743c7c87e7e"} Dec 03 20:17:25.439554 master-0 kubenswrapper[29252]: I1203 20:17:25.439417 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d","Type":"ContainerDied","Data":"a47d8d5fc9fb8d8f5c161e8c2f4a0a8e14e1a13017007439234149dd1f6a68f4"} Dec 03 20:17:25.439554 master-0 kubenswrapper[29252]: I1203 20:17:25.439437 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d","Type":"ContainerDied","Data":"34ce07503848cd6aad62ba91f3c407cfb5a322733a238fe0815d01f74c614873"} Dec 03 20:17:25.439554 master-0 kubenswrapper[29252]: I1203 20:17:25.439455 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d","Type":"ContainerDied","Data":"47a401e5c33c18bb1cfb970151b713d5420adf0b84eb4a88be63ba450bd5a61b"} Dec 03 20:17:25.439554 master-0 kubenswrapper[29252]: 
I1203 20:17:25.439472 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d","Type":"ContainerDied","Data":"41edc0c3867479b941ec31180dac8bca736f22ef5242e5d1acff2ee882afe88a"} Dec 03 20:17:25.439554 master-0 kubenswrapper[29252]: I1203 20:17:25.439489 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d","Type":"ContainerDied","Data":"0976a89f464ebb972c93a46088e9eeb54bd3bcf4771fafb2ab2a84f679391cfb"} Dec 03 20:17:25.439554 master-0 kubenswrapper[29252]: I1203 20:17:25.439507 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d","Type":"ContainerDied","Data":"458ccf479cec44e7661623ecdcdf188ad3fcab81c988b82ff57e5946294972a3"} Dec 03 20:17:25.439554 master-0 kubenswrapper[29252]: I1203 20:17:25.439525 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="458ccf479cec44e7661623ecdcdf188ad3fcab81c988b82ff57e5946294972a3" Dec 03 20:17:25.444461 master-0 kubenswrapper[29252]: I1203 20:17:25.444433 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 03 20:17:25.469334 master-0 kubenswrapper[29252]: I1203 20:17:25.469290 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-alertmanager-trusted-ca-bundle\") pod \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " Dec 03 20:17:25.469630 master-0 kubenswrapper[29252]: I1203 20:17:25.469614 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twgrr\" (UniqueName: \"kubernetes.io/projected/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-kube-api-access-twgrr\") pod \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " Dec 03 20:17:25.469723 master-0 kubenswrapper[29252]: I1203 20:17:25.469710 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " Dec 03 20:17:25.469843 master-0 kubenswrapper[29252]: I1203 20:17:25.469830 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-main-tls\") pod \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " Dec 03 20:17:25.470173 master-0 kubenswrapper[29252]: I1203 20:17:25.470159 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-config-out\") pod \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\" (UID: 
\"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " Dec 03 20:17:25.470276 master-0 kubenswrapper[29252]: I1203 20:17:25.470264 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-web-config\") pod \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " Dec 03 20:17:25.470386 master-0 kubenswrapper[29252]: I1203 20:17:25.470372 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-kube-rbac-proxy-web\") pod \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " Dec 03 20:17:25.470492 master-0 kubenswrapper[29252]: I1203 20:17:25.470481 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-config-volume\") pod \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " Dec 03 20:17:25.470577 master-0 kubenswrapper[29252]: I1203 20:17:25.470565 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-metrics-client-ca\") pod \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " Dec 03 20:17:25.471066 master-0 kubenswrapper[29252]: I1203 20:17:25.471050 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-alertmanager-main-db\") pod \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " Dec 03 20:17:25.471171 master-0 kubenswrapper[29252]: I1203 
20:17:25.471158 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-kube-rbac-proxy\") pod \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " Dec 03 20:17:25.471250 master-0 kubenswrapper[29252]: I1203 20:17:25.471238 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-tls-assets\") pod \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\" (UID: \"4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d\") " Dec 03 20:17:25.471484 master-0 kubenswrapper[29252]: I1203 20:17:25.470571 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" (UID: "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:17:25.471484 master-0 kubenswrapper[29252]: I1203 20:17:25.471383 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" (UID: "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:17:25.471924 master-0 kubenswrapper[29252]: I1203 20:17:25.471908 29252 reconciler_common.go:293] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:25.472225 master-0 kubenswrapper[29252]: I1203 20:17:25.472213 29252 reconciler_common.go:293] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-alertmanager-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:25.472289 master-0 kubenswrapper[29252]: I1203 20:17:25.471893 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" (UID: "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:17:25.478333 master-0 kubenswrapper[29252]: I1203 20:17:25.478277 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" (UID: "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:17:25.478333 master-0 kubenswrapper[29252]: I1203 20:17:25.478306 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" (UID: "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:17:25.478575 master-0 kubenswrapper[29252]: I1203 20:17:25.478376 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-config-volume" (OuterVolumeSpecName: "config-volume") pod "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" (UID: "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:17:25.478823 master-0 kubenswrapper[29252]: I1203 20:17:25.478761 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-kube-api-access-twgrr" (OuterVolumeSpecName: "kube-api-access-twgrr") pod "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" (UID: "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d"). InnerVolumeSpecName "kube-api-access-twgrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:17:25.480462 master-0 kubenswrapper[29252]: I1203 20:17:25.480426 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-config-out" (OuterVolumeSpecName: "config-out") pod "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" (UID: "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:17:25.480926 master-0 kubenswrapper[29252]: I1203 20:17:25.480880 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" (UID: "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:17:25.481230 master-0 kubenswrapper[29252]: I1203 20:17:25.481167 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" (UID: "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:17:25.490181 master-0 kubenswrapper[29252]: I1203 20:17:25.490057 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" (UID: "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:17:25.550078 master-0 kubenswrapper[29252]: I1203 20:17:25.549998 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-web-config" (OuterVolumeSpecName: "web-config") pod "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" (UID: "4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:17:25.573463 master-0 kubenswrapper[29252]: I1203 20:17:25.573420 29252 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:25.573463 master-0 kubenswrapper[29252]: I1203 20:17:25.573459 29252 reconciler_common.go:293] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-alertmanager-main-db\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:25.573698 master-0 kubenswrapper[29252]: I1203 20:17:25.573477 29252 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:25.573698 master-0 kubenswrapper[29252]: I1203 20:17:25.573492 29252 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-tls-assets\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:25.573698 master-0 kubenswrapper[29252]: I1203 20:17:25.573504 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twgrr\" (UniqueName: \"kubernetes.io/projected/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-kube-api-access-twgrr\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:25.573698 master-0 kubenswrapper[29252]: I1203 20:17:25.573517 29252 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-kube-rbac-proxy-metric\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:25.573698 master-0 kubenswrapper[29252]: I1203 20:17:25.573530 29252 reconciler_common.go:293] "Volume 
detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-main-tls\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:25.573698 master-0 kubenswrapper[29252]: I1203 20:17:25.573543 29252 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-config-out\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:25.573698 master-0 kubenswrapper[29252]: I1203 20:17:25.573554 29252 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-web-config\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:25.573698 master-0 kubenswrapper[29252]: I1203 20:17:25.573576 29252 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d-secret-alertmanager-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\"" Dec 03 20:17:26.441553 master-0 kubenswrapper[29252]: I1203 20:17:26.441440 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.496939 master-0 kubenswrapper[29252]: I1203 20:17:26.496832 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Dec 03 20:17:26.504944 master-0 kubenswrapper[29252]: I1203 20:17:26.504880 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Dec 03 20:17:26.536875 master-0 kubenswrapper[29252]: I1203 20:17:26.536751 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Dec 03 20:17:26.537160 master-0 kubenswrapper[29252]: E1203 20:17:26.537084 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="kube-rbac-proxy-metric"
Dec 03 20:17:26.537160 master-0 kubenswrapper[29252]: I1203 20:17:26.537101 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="kube-rbac-proxy-metric"
Dec 03 20:17:26.537160 master-0 kubenswrapper[29252]: E1203 20:17:26.537115 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="config-reloader"
Dec 03 20:17:26.537160 master-0 kubenswrapper[29252]: I1203 20:17:26.537122 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="config-reloader"
Dec 03 20:17:26.537160 master-0 kubenswrapper[29252]: E1203 20:17:26.537144 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d2903de-a51a-415a-80be-9ba79b4e173d" containerName="console"
Dec 03 20:17:26.537160 master-0 kubenswrapper[29252]: I1203 20:17:26.537152 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d2903de-a51a-415a-80be-9ba79b4e173d" containerName="console"
Dec 03 20:17:26.537160 master-0 kubenswrapper[29252]: E1203 20:17:26.537170 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="kube-rbac-proxy"
Dec 03 20:17:26.537613 master-0 kubenswrapper[29252]: I1203 20:17:26.537178 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="kube-rbac-proxy"
Dec 03 20:17:26.537613 master-0 kubenswrapper[29252]: E1203 20:17:26.537191 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="alertmanager"
Dec 03 20:17:26.537613 master-0 kubenswrapper[29252]: I1203 20:17:26.537199 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="alertmanager"
Dec 03 20:17:26.537613 master-0 kubenswrapper[29252]: E1203 20:17:26.537211 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="init-config-reloader"
Dec 03 20:17:26.537613 master-0 kubenswrapper[29252]: I1203 20:17:26.537219 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="init-config-reloader"
Dec 03 20:17:26.537613 master-0 kubenswrapper[29252]: E1203 20:17:26.537265 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="kube-rbac-proxy-web"
Dec 03 20:17:26.537613 master-0 kubenswrapper[29252]: I1203 20:17:26.537276 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="kube-rbac-proxy-web"
Dec 03 20:17:26.537613 master-0 kubenswrapper[29252]: E1203 20:17:26.537289 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="prom-label-proxy"
Dec 03 20:17:26.537613 master-0 kubenswrapper[29252]: I1203 20:17:26.537296 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="prom-label-proxy"
Dec 03 20:17:26.537613 master-0 kubenswrapper[29252]: I1203 20:17:26.537458 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="prom-label-proxy"
Dec 03 20:17:26.537613 master-0 kubenswrapper[29252]: I1203 20:17:26.537475 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="kube-rbac-proxy-web"
Dec 03 20:17:26.537613 master-0 kubenswrapper[29252]: I1203 20:17:26.537499 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="kube-rbac-proxy-metric"
Dec 03 20:17:26.537613 master-0 kubenswrapper[29252]: I1203 20:17:26.537511 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="kube-rbac-proxy"
Dec 03 20:17:26.537613 master-0 kubenswrapper[29252]: I1203 20:17:26.537520 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="alertmanager"
Dec 03 20:17:26.537613 master-0 kubenswrapper[29252]: I1203 20:17:26.537540 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d2903de-a51a-415a-80be-9ba79b4e173d" containerName="console"
Dec 03 20:17:26.537613 master-0 kubenswrapper[29252]: I1203 20:17:26.537570 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" containerName="config-reloader"
Dec 03 20:17:26.541274 master-0 kubenswrapper[29252]: I1203 20:17:26.541197 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.543052 master-0 kubenswrapper[29252]: I1203 20:17:26.542999 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0"
Dec 03 20:17:26.543502 master-0 kubenswrapper[29252]: I1203 20:17:26.543440 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric"
Dec 03 20:17:26.543715 master-0 kubenswrapper[29252]: I1203 20:17:26.543650 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web"
Dec 03 20:17:26.543878 master-0 kubenswrapper[29252]: I1203 20:17:26.543652 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Dec 03 20:17:26.543958 master-0 kubenswrapper[29252]: I1203 20:17:26.543452 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy"
Dec 03 20:17:26.544396 master-0 kubenswrapper[29252]: I1203 20:17:26.544347 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-kfnzv"
Dec 03 20:17:26.544469 master-0 kubenswrapper[29252]: I1203 20:17:26.544423 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config"
Dec 03 20:17:26.547440 master-0 kubenswrapper[29252]: I1203 20:17:26.547383 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated"
Dec 03 20:17:26.557820 master-0 kubenswrapper[29252]: I1203 20:17:26.557762 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle"
Dec 03 20:17:26.561112 master-0 kubenswrapper[29252]: I1203 20:17:26.561056 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Dec 03 20:17:26.588175 master-0 kubenswrapper[29252]: I1203 20:17:26.588124 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-web-config\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.588175 master-0 kubenswrapper[29252]: I1203 20:17:26.588176 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.588426 master-0 kubenswrapper[29252]: I1203 20:17:26.588203 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.588426 master-0 kubenswrapper[29252]: I1203 20:17:26.588224 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.588426 master-0 kubenswrapper[29252]: I1203 20:17:26.588242 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.588426 master-0 kubenswrapper[29252]: I1203 20:17:26.588265 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.588710 master-0 kubenswrapper[29252]: I1203 20:17:26.588568 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.588710 master-0 kubenswrapper[29252]: I1203 20:17:26.588623 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c28zn\" (UniqueName: \"kubernetes.io/projected/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-kube-api-access-c28zn\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.588917 master-0 kubenswrapper[29252]: I1203 20:17:26.588727 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.588917 master-0 kubenswrapper[29252]: I1203 20:17:26.588807 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-config-out\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.589140 master-0 kubenswrapper[29252]: I1203 20:17:26.589108 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-config-volume\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.589249 master-0 kubenswrapper[29252]: I1203 20:17:26.589164 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.690638 master-0 kubenswrapper[29252]: I1203 20:17:26.690555 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-config-volume\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.690912 master-0 kubenswrapper[29252]: I1203 20:17:26.690645 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.690912 master-0 kubenswrapper[29252]: I1203 20:17:26.690695 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-web-config\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.690912 master-0 kubenswrapper[29252]: I1203 20:17:26.690726 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.690912 master-0 kubenswrapper[29252]: I1203 20:17:26.690768 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.690912 master-0 kubenswrapper[29252]: I1203 20:17:26.690848 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.690912 master-0 kubenswrapper[29252]: I1203 20:17:26.690878 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.691185 master-0 kubenswrapper[29252]: I1203 20:17:26.690922 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.691185 master-0 kubenswrapper[29252]: I1203 20:17:26.691029 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.691185 master-0 kubenswrapper[29252]: I1203 20:17:26.691069 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c28zn\" (UniqueName: \"kubernetes.io/projected/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-kube-api-access-c28zn\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.691185 master-0 kubenswrapper[29252]: I1203 20:17:26.691109 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.691185 master-0 kubenswrapper[29252]: I1203 20:17:26.691141 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-config-out\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.692225 master-0 kubenswrapper[29252]: I1203 20:17:26.692066 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.692333 master-0 kubenswrapper[29252]: I1203 20:17:26.692286 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.692449 master-0 kubenswrapper[29252]: I1203 20:17:26.692412 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.694301 master-0 kubenswrapper[29252]: I1203 20:17:26.694250 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.694899 master-0 kubenswrapper[29252]: I1203 20:17:26.694856 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.694960 master-0 kubenswrapper[29252]: I1203 20:17:26.694919 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-config-out\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.695017 master-0 kubenswrapper[29252]: I1203 20:17:26.694980 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-web-config\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.695361 master-0 kubenswrapper[29252]: I1203 20:17:26.695324 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-config-volume\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.695985 master-0 kubenswrapper[29252]: I1203 20:17:26.695933 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.696643 master-0 kubenswrapper[29252]: I1203 20:17:26.696604 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.698502 master-0 kubenswrapper[29252]: I1203 20:17:26.698454 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.711930 master-0 kubenswrapper[29252]: I1203 20:17:26.711852 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c28zn\" (UniqueName: \"kubernetes.io/projected/208195c1-0fbf-4721-96bb-fcd9e1c0bc8f-kube-api-access-c28zn\") pod \"alertmanager-main-0\" (UID: \"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f\") " pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:26.936911 master-0 kubenswrapper[29252]: I1203 20:17:26.936770 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Dec 03 20:17:27.428745 master-0 kubenswrapper[29252]: I1203 20:17:27.428685 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d" path="/var/lib/kubelet/pods/4a6fb6ec-be7b-4987-a6dd-51ccb45e2b1d/volumes"
Dec 03 20:17:27.433009 master-0 kubenswrapper[29252]: I1203 20:17:27.432948 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Dec 03 20:17:27.443167 master-0 kubenswrapper[29252]: W1203 20:17:27.443114 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod208195c1_0fbf_4721_96bb_fcd9e1c0bc8f.slice/crio-d471c6d139aea20e02fe0f9dc5ecf53b3b31df251f010d6317001e3c75f6bb91 WatchSource:0}: Error finding container d471c6d139aea20e02fe0f9dc5ecf53b3b31df251f010d6317001e3c75f6bb91: Status 404 returned error can't find the container with id d471c6d139aea20e02fe0f9dc5ecf53b3b31df251f010d6317001e3c75f6bb91
Dec 03 20:17:27.457014 master-0 kubenswrapper[29252]: I1203 20:17:27.456958 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f","Type":"ContainerStarted","Data":"d471c6d139aea20e02fe0f9dc5ecf53b3b31df251f010d6317001e3c75f6bb91"}
Dec 03 20:17:28.466160 master-0 kubenswrapper[29252]: I1203 20:17:28.466080 29252 generic.go:334] "Generic (PLEG): container finished" podID="208195c1-0fbf-4721-96bb-fcd9e1c0bc8f" containerID="dc7f8aa11806e2ac883c7ce1cbad21ee3bc875a926415fca26550760caec2900" exitCode=0
Dec 03 20:17:28.466160 master-0 kubenswrapper[29252]: I1203 20:17:28.466132 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f","Type":"ContainerDied","Data":"dc7f8aa11806e2ac883c7ce1cbad21ee3bc875a926415fca26550760caec2900"}
Dec 03 20:17:29.479928 master-0 kubenswrapper[29252]: I1203 20:17:29.478976 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f","Type":"ContainerStarted","Data":"8706f745c30fd5ae3bea2a6eae13e5a0fdb1a02a6a5dfd1abcfd5ecaefb03362"}
Dec 03 20:17:29.479928 master-0 kubenswrapper[29252]: I1203 20:17:29.479030 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f","Type":"ContainerStarted","Data":"ea13414feb610702a1972d8fdedda53c310563c05a43cec4f67b9ced9b2bd041"}
Dec 03 20:17:29.479928 master-0 kubenswrapper[29252]: I1203 20:17:29.479040 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f","Type":"ContainerStarted","Data":"da61d07591993e37a97c86e0659cae4fd7e36c724825f07fd66df3a3ee0e6541"}
Dec 03 20:17:29.479928 master-0 kubenswrapper[29252]: I1203 20:17:29.479050 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f","Type":"ContainerStarted","Data":"bcd07b7850a5e0785e14fb0c4d40a3d00051f0b4f47536ab2e66044cbfd82b50"}
Dec 03 20:17:29.479928 master-0 kubenswrapper[29252]: I1203 20:17:29.479065 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f","Type":"ContainerStarted","Data":"060ecfbe1a813dacc24cef9eee99bfdcdd2e696d5d861fdac0cdc570797f95b4"}
Dec 03 20:17:29.479928 master-0 kubenswrapper[29252]: I1203 20:17:29.479073 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"208195c1-0fbf-4721-96bb-fcd9e1c0bc8f","Type":"ContainerStarted","Data":"e46fe60b49c71dbe29918b150f92f25ccfbeb76ffff279dc33676b1799b0e67f"}
Dec 03 20:17:29.520520 master-0 kubenswrapper[29252]: I1203 20:17:29.520404 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.520372841 podStartE2EDuration="3.520372841s" podCreationTimestamp="2025-12-03 20:17:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:17:29.514181239 +0000 UTC m=+484.327726222" watchObservedRunningTime="2025-12-03 20:17:29.520372841 +0000 UTC m=+484.333917824"
Dec 03 20:18:07.734369 master-0 kubenswrapper[29252]: I1203 20:18:07.734273 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Dec 03 20:18:07.786828 master-0 kubenswrapper[29252]: I1203 20:18:07.786763 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Dec 03 20:18:07.884953 master-0 kubenswrapper[29252]: I1203 20:18:07.884896 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Dec 03 20:18:13.324804 master-0 kubenswrapper[29252]: I1203 20:18:13.324695 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-4-retry-1-master-0"]
Dec 03 20:18:13.325899 master-0 kubenswrapper[29252]: I1203 20:18:13.325859 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-retry-1-master-0"
Dec 03 20:18:13.328355 master-0 kubenswrapper[29252]: I1203 20:18:13.328319 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Dec 03 20:18:13.328524 master-0 kubenswrapper[29252]: I1203 20:18:13.328505 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-nmjr4"
Dec 03 20:18:13.343641 master-0 kubenswrapper[29252]: I1203 20:18:13.343596 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-retry-1-master-0"]
Dec 03 20:18:13.371975 master-0 kubenswrapper[29252]: I1203 20:18:13.371923 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/100fda0d-de2e-42fa-8c3a-62ec03c956ef-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"100fda0d-de2e-42fa-8c3a-62ec03c956ef\") " pod="openshift-kube-controller-manager/installer-4-retry-1-master-0"
Dec 03 20:18:13.372186 master-0 kubenswrapper[29252]: I1203 20:18:13.372027 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/100fda0d-de2e-42fa-8c3a-62ec03c956ef-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"100fda0d-de2e-42fa-8c3a-62ec03c956ef\") " pod="openshift-kube-controller-manager/installer-4-retry-1-master-0"
Dec 03 20:18:13.372186 master-0 kubenswrapper[29252]: I1203 20:18:13.372090 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/100fda0d-de2e-42fa-8c3a-62ec03c956ef-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"100fda0d-de2e-42fa-8c3a-62ec03c956ef\") " pod="openshift-kube-controller-manager/installer-4-retry-1-master-0"
Dec 03 20:18:13.473733 master-0 kubenswrapper[29252]: I1203 20:18:13.473569 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/100fda0d-de2e-42fa-8c3a-62ec03c956ef-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"100fda0d-de2e-42fa-8c3a-62ec03c956ef\") " pod="openshift-kube-controller-manager/installer-4-retry-1-master-0"
Dec 03 20:18:13.473733 master-0 kubenswrapper[29252]: I1203 20:18:13.473665 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/100fda0d-de2e-42fa-8c3a-62ec03c956ef-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"100fda0d-de2e-42fa-8c3a-62ec03c956ef\") " pod="openshift-kube-controller-manager/installer-4-retry-1-master-0"
Dec 03 20:18:13.473733 master-0 kubenswrapper[29252]: I1203 20:18:13.473736 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/100fda0d-de2e-42fa-8c3a-62ec03c956ef-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"100fda0d-de2e-42fa-8c3a-62ec03c956ef\") " pod="openshift-kube-controller-manager/installer-4-retry-1-master-0"
Dec 03 20:18:13.474224 master-0 kubenswrapper[29252]: I1203 20:18:13.473832 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/100fda0d-de2e-42fa-8c3a-62ec03c956ef-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"100fda0d-de2e-42fa-8c3a-62ec03c956ef\") " pod="openshift-kube-controller-manager/installer-4-retry-1-master-0"
Dec 03 20:18:13.474224 master-0 kubenswrapper[29252]: I1203 20:18:13.473932 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/100fda0d-de2e-42fa-8c3a-62ec03c956ef-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"100fda0d-de2e-42fa-8c3a-62ec03c956ef\") " pod="openshift-kube-controller-manager/installer-4-retry-1-master-0"
Dec 03 20:18:13.513838 master-0 kubenswrapper[29252]: I1203 20:18:13.513745 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/100fda0d-de2e-42fa-8c3a-62ec03c956ef-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"100fda0d-de2e-42fa-8c3a-62ec03c956ef\") " pod="openshift-kube-controller-manager/installer-4-retry-1-master-0"
Dec 03 20:18:13.663819 master-0 kubenswrapper[29252]: I1203 20:18:13.663681 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-retry-1-master-0"
Dec 03 20:18:14.130649 master-0 kubenswrapper[29252]: I1203 20:18:14.130592 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-retry-1-master-0"]
Dec 03 20:18:14.901095 master-0 kubenswrapper[29252]: I1203 20:18:14.901010 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-retry-1-master-0" event={"ID":"100fda0d-de2e-42fa-8c3a-62ec03c956ef","Type":"ContainerStarted","Data":"c063818e4cc6e95137e29d5022447d420f8269053f54f36394f878b40c619306"}
Dec 03 20:18:14.901095 master-0 kubenswrapper[29252]: I1203 20:18:14.901068 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-retry-1-master-0" event={"ID":"100fda0d-de2e-42fa-8c3a-62ec03c956ef","Type":"ContainerStarted","Data":"7f87f0dff6e23cde9eaddcf55b70afd2b3165a9ff11f0d0b6ed5af2d9e7c5dcc"}
Dec 03 20:18:14.932241 master-0 kubenswrapper[29252]: I1203 20:18:14.932101 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-4-retry-1-master-0" podStartSLOduration=1.932079904 podStartE2EDuration="1.932079904s" podCreationTimestamp="2025-12-03 20:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:18:14.925175416 +0000 UTC m=+529.738720469" watchObservedRunningTime="2025-12-03 20:18:14.932079904 +0000 UTC m=+529.745624897"
Dec 03 20:18:47.377202 master-0 kubenswrapper[29252]: I1203 20:18:47.377114 29252 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Dec 03 20:18:47.378182 master-0 kubenswrapper[29252]: I1203 20:18:47.377576 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="6fb0810126310d28fb5532674012978b" containerName="cluster-policy-controller" containerID="cri-o://23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5" gracePeriod=30
Dec 03 20:18:47.378182 master-0 kubenswrapper[29252]: I1203 20:18:47.377928 29252 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Dec 03 20:18:47.378360 master-0 kubenswrapper[29252]: I1203 20:18:47.378046 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="6fb0810126310d28fb5532674012978b" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea" gracePeriod=30
Dec 03 20:18:47.378360 master-0 kubenswrapper[29252]: E1203 20:18:47.378234 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb0810126310d28fb5532674012978b" containerName="kube-controller-manager-recovery-controller"
Dec 03 20:18:47.378360 master-0 kubenswrapper[29252]: I1203 20:18:47.378252 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb0810126310d28fb5532674012978b" containerName="kube-controller-manager-recovery-controller"
Dec 03 20:18:47.378360 master-0 kubenswrapper[29252]: E1203 20:18:47.378274 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb0810126310d28fb5532674012978b" containerName="kube-controller-manager-cert-syncer"
Dec 03 20:18:47.378360 master-0 kubenswrapper[29252]: I1203 20:18:47.378280 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb0810126310d28fb5532674012978b" containerName="kube-controller-manager-cert-syncer"
Dec 03 20:18:47.378360 master-0 kubenswrapper[29252]: E1203 20:18:47.378294 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb0810126310d28fb5532674012978b" containerName="kube-controller-manager"
Dec 03 20:18:47.378360 master-0 kubenswrapper[29252]: I1203 20:18:47.378301 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb0810126310d28fb5532674012978b" containerName="kube-controller-manager"
Dec 03 20:18:47.378360 master-0 kubenswrapper[29252]: E1203 20:18:47.378322 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb0810126310d28fb5532674012978b" containerName="kube-controller-manager"
Dec 03 20:18:47.378360 master-0 kubenswrapper[29252]: I1203 20:18:47.378327 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb0810126310d28fb5532674012978b" containerName="kube-controller-manager"
Dec 03 20:18:47.378360 master-0 kubenswrapper[29252]: E1203 20:18:47.378343 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb0810126310d28fb5532674012978b" containerName="kube-controller-manager"
Dec 03 20:18:47.378360 master-0 kubenswrapper[29252]: I1203 20:18:47.378348 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb0810126310d28fb5532674012978b" containerName="kube-controller-manager"
Dec 03 20:18:47.378360 master-0 kubenswrapper[29252]: E1203 20:18:47.378358 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb0810126310d28fb5532674012978b"
containerName="cluster-policy-controller" Dec 03 20:18:47.378360 master-0 kubenswrapper[29252]: I1203 20:18:47.378363 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb0810126310d28fb5532674012978b" containerName="cluster-policy-controller" Dec 03 20:18:47.379488 master-0 kubenswrapper[29252]: I1203 20:18:47.378345 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="6fb0810126310d28fb5532674012978b" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb" gracePeriod=30 Dec 03 20:18:47.379488 master-0 kubenswrapper[29252]: I1203 20:18:47.378487 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fb0810126310d28fb5532674012978b" containerName="kube-controller-manager-recovery-controller" Dec 03 20:18:47.379488 master-0 kubenswrapper[29252]: I1203 20:18:47.378505 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fb0810126310d28fb5532674012978b" containerName="kube-controller-manager-cert-syncer" Dec 03 20:18:47.379488 master-0 kubenswrapper[29252]: I1203 20:18:47.378517 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fb0810126310d28fb5532674012978b" containerName="kube-controller-manager" Dec 03 20:18:47.379488 master-0 kubenswrapper[29252]: I1203 20:18:47.378533 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fb0810126310d28fb5532674012978b" containerName="kube-controller-manager" Dec 03 20:18:47.379488 master-0 kubenswrapper[29252]: I1203 20:18:47.378546 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fb0810126310d28fb5532674012978b" containerName="cluster-policy-controller" Dec 03 20:18:47.379488 master-0 kubenswrapper[29252]: I1203 20:18:47.378556 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fb0810126310d28fb5532674012978b" 
containerName="kube-controller-manager" Dec 03 20:18:47.379488 master-0 kubenswrapper[29252]: I1203 20:18:47.378616 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="6fb0810126310d28fb5532674012978b" containerName="kube-controller-manager" containerID="cri-o://e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046" gracePeriod=30 Dec 03 20:18:47.567679 master-0 kubenswrapper[29252]: I1203 20:18:47.567476 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ccb1b038ad82dbedfe0a11fa0cb80bdd-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"ccb1b038ad82dbedfe0a11fa0cb80bdd\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:18:47.567679 master-0 kubenswrapper[29252]: I1203 20:18:47.567587 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ccb1b038ad82dbedfe0a11fa0cb80bdd-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"ccb1b038ad82dbedfe0a11fa0cb80bdd\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:18:47.668803 master-0 kubenswrapper[29252]: I1203 20:18:47.668723 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ccb1b038ad82dbedfe0a11fa0cb80bdd-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"ccb1b038ad82dbedfe0a11fa0cb80bdd\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:18:47.669007 master-0 kubenswrapper[29252]: I1203 20:18:47.668832 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/ccb1b038ad82dbedfe0a11fa0cb80bdd-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"ccb1b038ad82dbedfe0a11fa0cb80bdd\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:18:47.669007 master-0 kubenswrapper[29252]: I1203 20:18:47.668861 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ccb1b038ad82dbedfe0a11fa0cb80bdd-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"ccb1b038ad82dbedfe0a11fa0cb80bdd\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:18:47.669007 master-0 kubenswrapper[29252]: I1203 20:18:47.668932 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ccb1b038ad82dbedfe0a11fa0cb80bdd-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"ccb1b038ad82dbedfe0a11fa0cb80bdd\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:18:47.739327 master-0 kubenswrapper[29252]: I1203 20:18:47.739276 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_6fb0810126310d28fb5532674012978b/kube-controller-manager/1.log" Dec 03 20:18:47.740632 master-0 kubenswrapper[29252]: I1203 20:18:47.740606 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_6fb0810126310d28fb5532674012978b/kube-controller-manager-cert-syncer/0.log" Dec 03 20:18:47.741337 master-0 kubenswrapper[29252]: I1203 20:18:47.741282 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:18:47.745770 master-0 kubenswrapper[29252]: I1203 20:18:47.745692 29252 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="6fb0810126310d28fb5532674012978b" podUID="ccb1b038ad82dbedfe0a11fa0cb80bdd" Dec 03 20:18:47.871609 master-0 kubenswrapper[29252]: I1203 20:18:47.871515 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/6fb0810126310d28fb5532674012978b-cert-dir\") pod \"6fb0810126310d28fb5532674012978b\" (UID: \"6fb0810126310d28fb5532674012978b\") " Dec 03 20:18:47.871856 master-0 kubenswrapper[29252]: I1203 20:18:47.871698 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/6fb0810126310d28fb5532674012978b-resource-dir\") pod \"6fb0810126310d28fb5532674012978b\" (UID: \"6fb0810126310d28fb5532674012978b\") " Dec 03 20:18:47.872400 master-0 kubenswrapper[29252]: I1203 20:18:47.872020 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fb0810126310d28fb5532674012978b-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "6fb0810126310d28fb5532674012978b" (UID: "6fb0810126310d28fb5532674012978b"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:18:47.872400 master-0 kubenswrapper[29252]: I1203 20:18:47.872099 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fb0810126310d28fb5532674012978b-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "6fb0810126310d28fb5532674012978b" (UID: "6fb0810126310d28fb5532674012978b"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:18:47.872400 master-0 kubenswrapper[29252]: I1203 20:18:47.872350 29252 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/6fb0810126310d28fb5532674012978b-cert-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:18:47.872400 master-0 kubenswrapper[29252]: I1203 20:18:47.872368 29252 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/6fb0810126310d28fb5532674012978b-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:18:48.231468 master-0 kubenswrapper[29252]: I1203 20:18:48.231404 29252 generic.go:334] "Generic (PLEG): container finished" podID="100fda0d-de2e-42fa-8c3a-62ec03c956ef" containerID="c063818e4cc6e95137e29d5022447d420f8269053f54f36394f878b40c619306" exitCode=0 Dec 03 20:18:48.231839 master-0 kubenswrapper[29252]: I1203 20:18:48.231492 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-retry-1-master-0" event={"ID":"100fda0d-de2e-42fa-8c3a-62ec03c956ef","Type":"ContainerDied","Data":"c063818e4cc6e95137e29d5022447d420f8269053f54f36394f878b40c619306"} Dec 03 20:18:48.234505 master-0 kubenswrapper[29252]: I1203 20:18:48.234437 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_6fb0810126310d28fb5532674012978b/kube-controller-manager/1.log" Dec 03 20:18:48.235822 master-0 kubenswrapper[29252]: I1203 20:18:48.235710 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_6fb0810126310d28fb5532674012978b/kube-controller-manager-cert-syncer/0.log" Dec 03 20:18:48.236366 master-0 kubenswrapper[29252]: I1203 20:18:48.236311 29252 generic.go:334] "Generic (PLEG): container finished" podID="6fb0810126310d28fb5532674012978b" 
containerID="e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046" exitCode=0 Dec 03 20:18:48.236366 master-0 kubenswrapper[29252]: I1203 20:18:48.236352 29252 generic.go:334] "Generic (PLEG): container finished" podID="6fb0810126310d28fb5532674012978b" containerID="dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea" exitCode=0 Dec 03 20:18:48.236366 master-0 kubenswrapper[29252]: I1203 20:18:48.236361 29252 scope.go:117] "RemoveContainer" containerID="e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046" Dec 03 20:18:48.236366 master-0 kubenswrapper[29252]: I1203 20:18:48.236367 29252 generic.go:334] "Generic (PLEG): container finished" podID="6fb0810126310d28fb5532674012978b" containerID="47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb" exitCode=2 Dec 03 20:18:48.236814 master-0 kubenswrapper[29252]: I1203 20:18:48.236389 29252 generic.go:334] "Generic (PLEG): container finished" podID="6fb0810126310d28fb5532674012978b" containerID="23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5" exitCode=0 Dec 03 20:18:48.236814 master-0 kubenswrapper[29252]: I1203 20:18:48.236409 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:18:48.259526 master-0 kubenswrapper[29252]: I1203 20:18:48.259417 29252 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="6fb0810126310d28fb5532674012978b" podUID="ccb1b038ad82dbedfe0a11fa0cb80bdd" Dec 03 20:18:48.263967 master-0 kubenswrapper[29252]: I1203 20:18:48.263920 29252 scope.go:117] "RemoveContainer" containerID="1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98" Dec 03 20:18:48.265846 master-0 kubenswrapper[29252]: I1203 20:18:48.265761 29252 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="6fb0810126310d28fb5532674012978b" podUID="ccb1b038ad82dbedfe0a11fa0cb80bdd" Dec 03 20:18:48.288153 master-0 kubenswrapper[29252]: I1203 20:18:48.288108 29252 scope.go:117] "RemoveContainer" containerID="dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea" Dec 03 20:18:48.305437 master-0 kubenswrapper[29252]: I1203 20:18:48.305390 29252 scope.go:117] "RemoveContainer" containerID="47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb" Dec 03 20:18:48.321064 master-0 kubenswrapper[29252]: I1203 20:18:48.321016 29252 scope.go:117] "RemoveContainer" containerID="23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5" Dec 03 20:18:48.338277 master-0 kubenswrapper[29252]: I1203 20:18:48.338231 29252 scope.go:117] "RemoveContainer" containerID="e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046" Dec 03 20:18:48.338682 master-0 kubenswrapper[29252]: E1203 20:18:48.338640 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046\": container 
with ID starting with e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046 not found: ID does not exist" containerID="e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046" Dec 03 20:18:48.338755 master-0 kubenswrapper[29252]: I1203 20:18:48.338696 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046"} err="failed to get container status \"e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046\": rpc error: code = NotFound desc = could not find container \"e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046\": container with ID starting with e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046 not found: ID does not exist" Dec 03 20:18:48.338755 master-0 kubenswrapper[29252]: I1203 20:18:48.338717 29252 scope.go:117] "RemoveContainer" containerID="1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98" Dec 03 20:18:48.339191 master-0 kubenswrapper[29252]: E1203 20:18:48.339133 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98\": container with ID starting with 1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98 not found: ID does not exist" containerID="1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98" Dec 03 20:18:48.339267 master-0 kubenswrapper[29252]: I1203 20:18:48.339190 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98"} err="failed to get container status \"1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98\": rpc error: code = NotFound desc = could not find container \"1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98\": container with ID starting with 
1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98 not found: ID does not exist" Dec 03 20:18:48.339267 master-0 kubenswrapper[29252]: I1203 20:18:48.339231 29252 scope.go:117] "RemoveContainer" containerID="dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea" Dec 03 20:18:48.339560 master-0 kubenswrapper[29252]: E1203 20:18:48.339526 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea\": container with ID starting with dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea not found: ID does not exist" containerID="dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea" Dec 03 20:18:48.339560 master-0 kubenswrapper[29252]: I1203 20:18:48.339552 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea"} err="failed to get container status \"dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea\": rpc error: code = NotFound desc = could not find container \"dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea\": container with ID starting with dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea not found: ID does not exist" Dec 03 20:18:48.339667 master-0 kubenswrapper[29252]: I1203 20:18:48.339567 29252 scope.go:117] "RemoveContainer" containerID="47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb" Dec 03 20:18:48.339930 master-0 kubenswrapper[29252]: E1203 20:18:48.339875 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb\": container with ID starting with 47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb not found: ID does not exist" 
containerID="47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb" Dec 03 20:18:48.340004 master-0 kubenswrapper[29252]: I1203 20:18:48.339934 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb"} err="failed to get container status \"47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb\": rpc error: code = NotFound desc = could not find container \"47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb\": container with ID starting with 47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb not found: ID does not exist" Dec 03 20:18:48.340004 master-0 kubenswrapper[29252]: I1203 20:18:48.339968 29252 scope.go:117] "RemoveContainer" containerID="23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5" Dec 03 20:18:48.340358 master-0 kubenswrapper[29252]: E1203 20:18:48.340308 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5\": container with ID starting with 23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5 not found: ID does not exist" containerID="23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5" Dec 03 20:18:48.340358 master-0 kubenswrapper[29252]: I1203 20:18:48.340343 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5"} err="failed to get container status \"23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5\": rpc error: code = NotFound desc = could not find container \"23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5\": container with ID starting with 23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5 not found: ID does not exist" Dec 03 20:18:48.340512 master-0 
kubenswrapper[29252]: I1203 20:18:48.340365 29252 scope.go:117] "RemoveContainer" containerID="e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046" Dec 03 20:18:48.340664 master-0 kubenswrapper[29252]: I1203 20:18:48.340623 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046"} err="failed to get container status \"e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046\": rpc error: code = NotFound desc = could not find container \"e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046\": container with ID starting with e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046 not found: ID does not exist" Dec 03 20:18:48.340664 master-0 kubenswrapper[29252]: I1203 20:18:48.340649 29252 scope.go:117] "RemoveContainer" containerID="1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98" Dec 03 20:18:48.340961 master-0 kubenswrapper[29252]: I1203 20:18:48.340925 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98"} err="failed to get container status \"1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98\": rpc error: code = NotFound desc = could not find container \"1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98\": container with ID starting with 1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98 not found: ID does not exist" Dec 03 20:18:48.340961 master-0 kubenswrapper[29252]: I1203 20:18:48.340946 29252 scope.go:117] "RemoveContainer" containerID="dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea" Dec 03 20:18:48.341273 master-0 kubenswrapper[29252]: I1203 20:18:48.341227 29252 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea"} err="failed to get container status \"dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea\": rpc error: code = NotFound desc = could not find container \"dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea\": container with ID starting with dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea not found: ID does not exist" Dec 03 20:18:48.341273 master-0 kubenswrapper[29252]: I1203 20:18:48.341264 29252 scope.go:117] "RemoveContainer" containerID="47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb" Dec 03 20:18:48.341629 master-0 kubenswrapper[29252]: I1203 20:18:48.341583 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb"} err="failed to get container status \"47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb\": rpc error: code = NotFound desc = could not find container \"47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb\": container with ID starting with 47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb not found: ID does not exist" Dec 03 20:18:48.341629 master-0 kubenswrapper[29252]: I1203 20:18:48.341612 29252 scope.go:117] "RemoveContainer" containerID="23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5" Dec 03 20:18:48.341952 master-0 kubenswrapper[29252]: I1203 20:18:48.341891 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5"} err="failed to get container status \"23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5\": rpc error: code = NotFound desc = could not find container \"23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5\": container with ID starting with 
23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5 not found: ID does not exist" Dec 03 20:18:48.341952 master-0 kubenswrapper[29252]: I1203 20:18:48.341947 29252 scope.go:117] "RemoveContainer" containerID="e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046" Dec 03 20:18:48.342282 master-0 kubenswrapper[29252]: I1203 20:18:48.342242 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046"} err="failed to get container status \"e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046\": rpc error: code = NotFound desc = could not find container \"e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046\": container with ID starting with e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046 not found: ID does not exist" Dec 03 20:18:48.342282 master-0 kubenswrapper[29252]: I1203 20:18:48.342271 29252 scope.go:117] "RemoveContainer" containerID="1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98" Dec 03 20:18:48.342597 master-0 kubenswrapper[29252]: I1203 20:18:48.342555 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98"} err="failed to get container status \"1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98\": rpc error: code = NotFound desc = could not find container \"1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98\": container with ID starting with 1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98 not found: ID does not exist" Dec 03 20:18:48.342597 master-0 kubenswrapper[29252]: I1203 20:18:48.342594 29252 scope.go:117] "RemoveContainer" containerID="dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea" Dec 03 20:18:48.342932 master-0 kubenswrapper[29252]: I1203 20:18:48.342899 29252 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea"} err="failed to get container status \"dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea\": rpc error: code = NotFound desc = could not find container \"dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea\": container with ID starting with dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea not found: ID does not exist" Dec 03 20:18:48.342995 master-0 kubenswrapper[29252]: I1203 20:18:48.342932 29252 scope.go:117] "RemoveContainer" containerID="47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb" Dec 03 20:18:48.343758 master-0 kubenswrapper[29252]: I1203 20:18:48.343710 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb"} err="failed to get container status \"47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb\": rpc error: code = NotFound desc = could not find container \"47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb\": container with ID starting with 47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb not found: ID does not exist" Dec 03 20:18:48.343840 master-0 kubenswrapper[29252]: I1203 20:18:48.343801 29252 scope.go:117] "RemoveContainer" containerID="23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5" Dec 03 20:18:48.344639 master-0 kubenswrapper[29252]: I1203 20:18:48.344567 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5"} err="failed to get container status \"23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5\": rpc error: code = NotFound desc = could not find container \"23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5\": container with ID starting 
with 23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5 not found: ID does not exist" Dec 03 20:18:48.344639 master-0 kubenswrapper[29252]: I1203 20:18:48.344627 29252 scope.go:117] "RemoveContainer" containerID="e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046" Dec 03 20:18:48.345028 master-0 kubenswrapper[29252]: I1203 20:18:48.344985 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046"} err="failed to get container status \"e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046\": rpc error: code = NotFound desc = could not find container \"e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046\": container with ID starting with e3d48bbebde0e3be31d9b419dead9aee6bd4e55327fcc43462432903c757f046 not found: ID does not exist" Dec 03 20:18:48.345028 master-0 kubenswrapper[29252]: I1203 20:18:48.345017 29252 scope.go:117] "RemoveContainer" containerID="1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98" Dec 03 20:18:48.345379 master-0 kubenswrapper[29252]: I1203 20:18:48.345314 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98"} err="failed to get container status \"1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98\": rpc error: code = NotFound desc = could not find container \"1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98\": container with ID starting with 1022b071afdcc488e5b811897fc61341418fd8cb539a3cb33cdf5eb865dd8c98 not found: ID does not exist" Dec 03 20:18:48.345437 master-0 kubenswrapper[29252]: I1203 20:18:48.345375 29252 scope.go:117] "RemoveContainer" containerID="dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea" Dec 03 20:18:48.345771 master-0 kubenswrapper[29252]: I1203 20:18:48.345727 29252 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea"} err="failed to get container status \"dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea\": rpc error: code = NotFound desc = could not find container \"dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea\": container with ID starting with dac209c872cd342ca3d91a1360393c6a49075c71538398e352e2b9bd76cf6cea not found: ID does not exist" Dec 03 20:18:48.345771 master-0 kubenswrapper[29252]: I1203 20:18:48.345758 29252 scope.go:117] "RemoveContainer" containerID="47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb" Dec 03 20:18:48.346941 master-0 kubenswrapper[29252]: I1203 20:18:48.346889 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb"} err="failed to get container status \"47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb\": rpc error: code = NotFound desc = could not find container \"47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb\": container with ID starting with 47037c534ad124219bae3512db8b36cae596ddff0fbbf455ddfccf22a4a0becb not found: ID does not exist" Dec 03 20:18:48.346941 master-0 kubenswrapper[29252]: I1203 20:18:48.346931 29252 scope.go:117] "RemoveContainer" containerID="23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5" Dec 03 20:18:48.347301 master-0 kubenswrapper[29252]: I1203 20:18:48.347252 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5"} err="failed to get container status \"23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5\": rpc error: code = NotFound desc = could not find container \"23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5\": 
container with ID starting with 23f5bb0a4826d2606b05310bc893b7229ab016de0f2d6e0f05c6a59da0a83bf5 not found: ID does not exist" Dec 03 20:18:49.426441 master-0 kubenswrapper[29252]: I1203 20:18:49.426210 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fb0810126310d28fb5532674012978b" path="/var/lib/kubelet/pods/6fb0810126310d28fb5532674012978b/volumes" Dec 03 20:18:49.563085 master-0 kubenswrapper[29252]: I1203 20:18:49.562971 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-retry-1-master-0" Dec 03 20:18:49.604264 master-0 kubenswrapper[29252]: I1203 20:18:49.604195 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/100fda0d-de2e-42fa-8c3a-62ec03c956ef-var-lock\") pod \"100fda0d-de2e-42fa-8c3a-62ec03c956ef\" (UID: \"100fda0d-de2e-42fa-8c3a-62ec03c956ef\") " Dec 03 20:18:49.604670 master-0 kubenswrapper[29252]: I1203 20:18:49.604384 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/100fda0d-de2e-42fa-8c3a-62ec03c956ef-var-lock" (OuterVolumeSpecName: "var-lock") pod "100fda0d-de2e-42fa-8c3a-62ec03c956ef" (UID: "100fda0d-de2e-42fa-8c3a-62ec03c956ef"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:18:49.604670 master-0 kubenswrapper[29252]: I1203 20:18:49.604446 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/100fda0d-de2e-42fa-8c3a-62ec03c956ef-kube-api-access\") pod \"100fda0d-de2e-42fa-8c3a-62ec03c956ef\" (UID: \"100fda0d-de2e-42fa-8c3a-62ec03c956ef\") " Dec 03 20:18:49.604670 master-0 kubenswrapper[29252]: I1203 20:18:49.604602 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/100fda0d-de2e-42fa-8c3a-62ec03c956ef-kubelet-dir\") pod \"100fda0d-de2e-42fa-8c3a-62ec03c956ef\" (UID: \"100fda0d-de2e-42fa-8c3a-62ec03c956ef\") " Dec 03 20:18:49.604855 master-0 kubenswrapper[29252]: I1203 20:18:49.604751 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/100fda0d-de2e-42fa-8c3a-62ec03c956ef-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "100fda0d-de2e-42fa-8c3a-62ec03c956ef" (UID: "100fda0d-de2e-42fa-8c3a-62ec03c956ef"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 20:18:49.605282 master-0 kubenswrapper[29252]: I1203 20:18:49.605252 29252 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/100fda0d-de2e-42fa-8c3a-62ec03c956ef-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 20:18:49.605329 master-0 kubenswrapper[29252]: I1203 20:18:49.605284 29252 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/100fda0d-de2e-42fa-8c3a-62ec03c956ef-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 20:18:49.610659 master-0 kubenswrapper[29252]: I1203 20:18:49.610594 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/100fda0d-de2e-42fa-8c3a-62ec03c956ef-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "100fda0d-de2e-42fa-8c3a-62ec03c956ef" (UID: "100fda0d-de2e-42fa-8c3a-62ec03c956ef"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:18:49.707288 master-0 kubenswrapper[29252]: I1203 20:18:49.707225 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/100fda0d-de2e-42fa-8c3a-62ec03c956ef-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 20:18:50.255272 master-0 kubenswrapper[29252]: I1203 20:18:50.255201 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-retry-1-master-0" event={"ID":"100fda0d-de2e-42fa-8c3a-62ec03c956ef","Type":"ContainerDied","Data":"7f87f0dff6e23cde9eaddcf55b70afd2b3165a9ff11f0d0b6ed5af2d9e7c5dcc"} Dec 03 20:18:50.255517 master-0 kubenswrapper[29252]: I1203 20:18:50.255287 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f87f0dff6e23cde9eaddcf55b70afd2b3165a9ff11f0d0b6ed5af2d9e7c5dcc" Dec 03 20:18:50.255517 master-0 kubenswrapper[29252]: I1203 20:18:50.255218 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-retry-1-master-0" Dec 03 20:19:02.416335 master-0 kubenswrapper[29252]: I1203 20:19:02.416265 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:19:02.432853 master-0 kubenswrapper[29252]: I1203 20:19:02.432802 29252 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="8e1759dd-5bb3-4112-9eca-cf9f64a56c9e" Dec 03 20:19:02.432853 master-0 kubenswrapper[29252]: I1203 20:19:02.432845 29252 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="8e1759dd-5bb3-4112-9eca-cf9f64a56c9e" Dec 03 20:19:02.449746 master-0 kubenswrapper[29252]: I1203 20:19:02.449677 29252 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:19:02.452703 master-0 kubenswrapper[29252]: I1203 20:19:02.452144 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 20:19:02.458284 master-0 kubenswrapper[29252]: I1203 20:19:02.458231 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 20:19:02.470805 master-0 kubenswrapper[29252]: I1203 20:19:02.470715 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:19:02.471162 master-0 kubenswrapper[29252]: I1203 20:19:02.471120 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 20:19:02.498196 master-0 kubenswrapper[29252]: W1203 20:19:02.498025 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccb1b038ad82dbedfe0a11fa0cb80bdd.slice/crio-f107faab5c858bf63abc2e4497bd596163b5868d6f76ee94f0f683a2617c2bd8 WatchSource:0}: Error finding container f107faab5c858bf63abc2e4497bd596163b5868d6f76ee94f0f683a2617c2bd8: Status 404 returned error can't find the container with id f107faab5c858bf63abc2e4497bd596163b5868d6f76ee94f0f683a2617c2bd8 Dec 03 20:19:03.373059 master-0 kubenswrapper[29252]: I1203 20:19:03.372981 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"ccb1b038ad82dbedfe0a11fa0cb80bdd","Type":"ContainerStarted","Data":"1e3fa8c62241367ed56ec95eb9d489a293b5ced149acbc0cb9a4d0c329eae37e"} Dec 03 20:19:03.373059 master-0 kubenswrapper[29252]: I1203 20:19:03.373040 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"ccb1b038ad82dbedfe0a11fa0cb80bdd","Type":"ContainerStarted","Data":"7932380785b42f9a85d0433dc403ca794ecece4b270f44be9af1ce59b3b09a6e"} Dec 03 20:19:03.373059 master-0 kubenswrapper[29252]: I1203 20:19:03.373054 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"ccb1b038ad82dbedfe0a11fa0cb80bdd","Type":"ContainerStarted","Data":"349c3271099e5b148c579f1a0fd7cd089cfb56b1acac9af5279286bc003367d9"} Dec 03 20:19:03.373059 master-0 kubenswrapper[29252]: I1203 20:19:03.373066 29252 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"ccb1b038ad82dbedfe0a11fa0cb80bdd","Type":"ContainerStarted","Data":"f107faab5c858bf63abc2e4497bd596163b5868d6f76ee94f0f683a2617c2bd8"} Dec 03 20:19:04.382367 master-0 kubenswrapper[29252]: I1203 20:19:04.382302 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"ccb1b038ad82dbedfe0a11fa0cb80bdd","Type":"ContainerStarted","Data":"f96d1b6677e9113c783e1f4a357b3ab0ba93e92043f14370879ba7c4f1b77446"} Dec 03 20:19:04.408124 master-0 kubenswrapper[29252]: I1203 20:19:04.408040 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.408024831 podStartE2EDuration="2.408024831s" podCreationTimestamp="2025-12-03 20:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:19:04.402587908 +0000 UTC m=+579.216132871" watchObservedRunningTime="2025-12-03 20:19:04.408024831 +0000 UTC m=+579.221569774" Dec 03 20:19:12.471888 master-0 kubenswrapper[29252]: I1203 20:19:12.471735 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:19:12.471888 master-0 kubenswrapper[29252]: I1203 20:19:12.471824 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:19:12.471888 master-0 kubenswrapper[29252]: I1203 20:19:12.471846 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:19:12.471888 master-0 kubenswrapper[29252]: I1203 20:19:12.471864 29252 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:19:12.479182 master-0 kubenswrapper[29252]: I1203 20:19:12.479121 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:19:12.479407 master-0 kubenswrapper[29252]: I1203 20:19:12.479245 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:19:13.466249 master-0 kubenswrapper[29252]: I1203 20:19:13.466127 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:19:13.467327 master-0 kubenswrapper[29252]: I1203 20:19:13.467251 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 20:19:15.756633 master-0 kubenswrapper[29252]: I1203 20:19:15.756276 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pvg4r"] Dec 03 20:19:15.759279 master-0 kubenswrapper[29252]: E1203 20:19:15.756648 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100fda0d-de2e-42fa-8c3a-62ec03c956ef" containerName="installer" Dec 03 20:19:15.759279 master-0 kubenswrapper[29252]: I1203 20:19:15.756673 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="100fda0d-de2e-42fa-8c3a-62ec03c956ef" containerName="installer" Dec 03 20:19:15.759279 master-0 kubenswrapper[29252]: I1203 20:19:15.757285 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="100fda0d-de2e-42fa-8c3a-62ec03c956ef" containerName="installer" Dec 03 20:19:15.759279 master-0 kubenswrapper[29252]: I1203 20:19:15.758568 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pvg4r" Dec 03 20:19:15.788263 master-0 kubenswrapper[29252]: I1203 20:19:15.788194 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pvg4r"] Dec 03 20:19:15.855187 master-0 kubenswrapper[29252]: I1203 20:19:15.854641 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk4qb\" (UniqueName: \"kubernetes.io/projected/1845932f-38ee-42d4-b7ed-9598a45f5529-kube-api-access-rk4qb\") pod \"community-operators-pvg4r\" (UID: \"1845932f-38ee-42d4-b7ed-9598a45f5529\") " pod="openshift-marketplace/community-operators-pvg4r" Dec 03 20:19:15.855187 master-0 kubenswrapper[29252]: I1203 20:19:15.854744 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1845932f-38ee-42d4-b7ed-9598a45f5529-utilities\") pod \"community-operators-pvg4r\" (UID: \"1845932f-38ee-42d4-b7ed-9598a45f5529\") " pod="openshift-marketplace/community-operators-pvg4r" Dec 03 20:19:15.855187 master-0 kubenswrapper[29252]: I1203 20:19:15.854862 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1845932f-38ee-42d4-b7ed-9598a45f5529-catalog-content\") pod \"community-operators-pvg4r\" (UID: \"1845932f-38ee-42d4-b7ed-9598a45f5529\") " pod="openshift-marketplace/community-operators-pvg4r" Dec 03 20:19:15.956094 master-0 kubenswrapper[29252]: I1203 20:19:15.956035 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk4qb\" (UniqueName: \"kubernetes.io/projected/1845932f-38ee-42d4-b7ed-9598a45f5529-kube-api-access-rk4qb\") pod \"community-operators-pvg4r\" (UID: \"1845932f-38ee-42d4-b7ed-9598a45f5529\") " pod="openshift-marketplace/community-operators-pvg4r" Dec 03 
20:19:15.956094 master-0 kubenswrapper[29252]: I1203 20:19:15.956088 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1845932f-38ee-42d4-b7ed-9598a45f5529-utilities\") pod \"community-operators-pvg4r\" (UID: \"1845932f-38ee-42d4-b7ed-9598a45f5529\") " pod="openshift-marketplace/community-operators-pvg4r" Dec 03 20:19:15.956404 master-0 kubenswrapper[29252]: I1203 20:19:15.956135 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1845932f-38ee-42d4-b7ed-9598a45f5529-catalog-content\") pod \"community-operators-pvg4r\" (UID: \"1845932f-38ee-42d4-b7ed-9598a45f5529\") " pod="openshift-marketplace/community-operators-pvg4r" Dec 03 20:19:15.956721 master-0 kubenswrapper[29252]: I1203 20:19:15.956662 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1845932f-38ee-42d4-b7ed-9598a45f5529-catalog-content\") pod \"community-operators-pvg4r\" (UID: \"1845932f-38ee-42d4-b7ed-9598a45f5529\") " pod="openshift-marketplace/community-operators-pvg4r" Dec 03 20:19:15.956967 master-0 kubenswrapper[29252]: I1203 20:19:15.956911 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1845932f-38ee-42d4-b7ed-9598a45f5529-utilities\") pod \"community-operators-pvg4r\" (UID: \"1845932f-38ee-42d4-b7ed-9598a45f5529\") " pod="openshift-marketplace/community-operators-pvg4r" Dec 03 20:19:15.973170 master-0 kubenswrapper[29252]: I1203 20:19:15.973121 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk4qb\" (UniqueName: \"kubernetes.io/projected/1845932f-38ee-42d4-b7ed-9598a45f5529-kube-api-access-rk4qb\") pod \"community-operators-pvg4r\" (UID: \"1845932f-38ee-42d4-b7ed-9598a45f5529\") " 
pod="openshift-marketplace/community-operators-pvg4r" Dec 03 20:19:16.088440 master-0 kubenswrapper[29252]: I1203 20:19:16.088313 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pvg4r" Dec 03 20:19:16.587932 master-0 kubenswrapper[29252]: I1203 20:19:16.587877 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pvg4r"] Dec 03 20:19:16.595077 master-0 kubenswrapper[29252]: W1203 20:19:16.595023 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1845932f_38ee_42d4_b7ed_9598a45f5529.slice/crio-85705209a38b758e534f72062dbd6ed9d2f5d32061e5ec02ec314d2856c3525e WatchSource:0}: Error finding container 85705209a38b758e534f72062dbd6ed9d2f5d32061e5ec02ec314d2856c3525e: Status 404 returned error can't find the container with id 85705209a38b758e534f72062dbd6ed9d2f5d32061e5ec02ec314d2856c3525e Dec 03 20:19:17.491694 master-0 kubenswrapper[29252]: I1203 20:19:17.491613 29252 generic.go:334] "Generic (PLEG): container finished" podID="1845932f-38ee-42d4-b7ed-9598a45f5529" containerID="aa17aa9b9b9e4a76185c997cad9fbadf5e8dcb7893ae87af3ab39d73fc5c6e19" exitCode=0 Dec 03 20:19:17.491694 master-0 kubenswrapper[29252]: I1203 20:19:17.491661 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvg4r" event={"ID":"1845932f-38ee-42d4-b7ed-9598a45f5529","Type":"ContainerDied","Data":"aa17aa9b9b9e4a76185c997cad9fbadf5e8dcb7893ae87af3ab39d73fc5c6e19"} Dec 03 20:19:17.491694 master-0 kubenswrapper[29252]: I1203 20:19:17.491686 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvg4r" event={"ID":"1845932f-38ee-42d4-b7ed-9598a45f5529","Type":"ContainerStarted","Data":"85705209a38b758e534f72062dbd6ed9d2f5d32061e5ec02ec314d2856c3525e"} Dec 03 20:19:18.508300 master-0 kubenswrapper[29252]: 
I1203 20:19:18.508167 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvg4r" event={"ID":"1845932f-38ee-42d4-b7ed-9598a45f5529","Type":"ContainerStarted","Data":"47afacca7724c53c777fefb8b39533489fb513ae1aa95777d683017f587ed0d8"} Dec 03 20:19:19.518564 master-0 kubenswrapper[29252]: I1203 20:19:19.518513 29252 generic.go:334] "Generic (PLEG): container finished" podID="1845932f-38ee-42d4-b7ed-9598a45f5529" containerID="47afacca7724c53c777fefb8b39533489fb513ae1aa95777d683017f587ed0d8" exitCode=0 Dec 03 20:19:19.519189 master-0 kubenswrapper[29252]: I1203 20:19:19.518573 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvg4r" event={"ID":"1845932f-38ee-42d4-b7ed-9598a45f5529","Type":"ContainerDied","Data":"47afacca7724c53c777fefb8b39533489fb513ae1aa95777d683017f587ed0d8"} Dec 03 20:19:19.519276 master-0 kubenswrapper[29252]: I1203 20:19:19.519262 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvg4r" event={"ID":"1845932f-38ee-42d4-b7ed-9598a45f5529","Type":"ContainerStarted","Data":"106890226fbe1bfa06ec76c8ebe0f58d940888185e1e5dc6f40744711af9d36c"} Dec 03 20:19:19.545894 master-0 kubenswrapper[29252]: I1203 20:19:19.545806 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pvg4r" podStartSLOduration=3.003840273 podStartE2EDuration="4.545764478s" podCreationTimestamp="2025-12-03 20:19:15 +0000 UTC" firstStartedPulling="2025-12-03 20:19:17.493694849 +0000 UTC m=+592.307239802" lastFinishedPulling="2025-12-03 20:19:19.035619054 +0000 UTC m=+593.849164007" observedRunningTime="2025-12-03 20:19:19.539633749 +0000 UTC m=+594.353178712" watchObservedRunningTime="2025-12-03 20:19:19.545764478 +0000 UTC m=+594.359309431" Dec 03 20:19:20.645189 master-0 kubenswrapper[29252]: I1203 20:19:20.645099 29252 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["sushy-emulator/sushy-emulator-58f4c9b998-p4zp5"] Dec 03 20:19:20.646055 master-0 kubenswrapper[29252]: I1203 20:19:20.646029 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-58f4c9b998-p4zp5" Dec 03 20:19:20.648010 master-0 kubenswrapper[29252]: I1203 20:19:20.647956 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"kube-root-ca.crt" Dec 03 20:19:20.648145 master-0 kubenswrapper[29252]: I1203 20:19:20.648120 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"sushy-emulator-config" Dec 03 20:19:20.648401 master-0 kubenswrapper[29252]: I1203 20:19:20.648365 29252 reflector.go:368] Caches populated for *v1.Secret from object-"sushy-emulator"/"os-client-config" Dec 03 20:19:20.651904 master-0 kubenswrapper[29252]: I1203 20:19:20.651852 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"openshift-service-ca.crt" Dec 03 20:19:20.655297 master-0 kubenswrapper[29252]: I1203 20:19:20.655246 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-58f4c9b998-p4zp5"] Dec 03 20:19:20.748005 master-0 kubenswrapper[29252]: I1203 20:19:20.747927 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mstr\" (UniqueName: \"kubernetes.io/projected/d1b07b73-62a5-42e9-9971-f7eca211f897-kube-api-access-8mstr\") pod \"sushy-emulator-58f4c9b998-p4zp5\" (UID: \"d1b07b73-62a5-42e9-9971-f7eca211f897\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-p4zp5" Dec 03 20:19:20.748452 master-0 kubenswrapper[29252]: I1203 20:19:20.748195 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/d1b07b73-62a5-42e9-9971-f7eca211f897-os-client-config\") pod \"sushy-emulator-58f4c9b998-p4zp5\" (UID: 
\"d1b07b73-62a5-42e9-9971-f7eca211f897\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-p4zp5" Dec 03 20:19:20.748452 master-0 kubenswrapper[29252]: I1203 20:19:20.748313 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/d1b07b73-62a5-42e9-9971-f7eca211f897-sushy-emulator-config\") pod \"sushy-emulator-58f4c9b998-p4zp5\" (UID: \"d1b07b73-62a5-42e9-9971-f7eca211f897\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-p4zp5" Dec 03 20:19:20.850137 master-0 kubenswrapper[29252]: I1203 20:19:20.850065 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mstr\" (UniqueName: \"kubernetes.io/projected/d1b07b73-62a5-42e9-9971-f7eca211f897-kube-api-access-8mstr\") pod \"sushy-emulator-58f4c9b998-p4zp5\" (UID: \"d1b07b73-62a5-42e9-9971-f7eca211f897\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-p4zp5" Dec 03 20:19:20.850137 master-0 kubenswrapper[29252]: I1203 20:19:20.850148 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/d1b07b73-62a5-42e9-9971-f7eca211f897-os-client-config\") pod \"sushy-emulator-58f4c9b998-p4zp5\" (UID: \"d1b07b73-62a5-42e9-9971-f7eca211f897\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-p4zp5" Dec 03 20:19:20.850389 master-0 kubenswrapper[29252]: I1203 20:19:20.850317 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/d1b07b73-62a5-42e9-9971-f7eca211f897-sushy-emulator-config\") pod \"sushy-emulator-58f4c9b998-p4zp5\" (UID: \"d1b07b73-62a5-42e9-9971-f7eca211f897\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-p4zp5" Dec 03 20:19:20.851429 master-0 kubenswrapper[29252]: I1203 20:19:20.851392 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" 
(UniqueName: \"kubernetes.io/configmap/d1b07b73-62a5-42e9-9971-f7eca211f897-sushy-emulator-config\") pod \"sushy-emulator-58f4c9b998-p4zp5\" (UID: \"d1b07b73-62a5-42e9-9971-f7eca211f897\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-p4zp5" Dec 03 20:19:20.853418 master-0 kubenswrapper[29252]: I1203 20:19:20.853379 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/d1b07b73-62a5-42e9-9971-f7eca211f897-os-client-config\") pod \"sushy-emulator-58f4c9b998-p4zp5\" (UID: \"d1b07b73-62a5-42e9-9971-f7eca211f897\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-p4zp5" Dec 03 20:19:20.866180 master-0 kubenswrapper[29252]: I1203 20:19:20.866124 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mstr\" (UniqueName: \"kubernetes.io/projected/d1b07b73-62a5-42e9-9971-f7eca211f897-kube-api-access-8mstr\") pod \"sushy-emulator-58f4c9b998-p4zp5\" (UID: \"d1b07b73-62a5-42e9-9971-f7eca211f897\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-p4zp5" Dec 03 20:19:20.978442 master-0 kubenswrapper[29252]: I1203 20:19:20.978261 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-58f4c9b998-p4zp5" Dec 03 20:19:21.408841 master-0 kubenswrapper[29252]: I1203 20:19:21.408797 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-58f4c9b998-p4zp5"] Dec 03 20:19:21.411434 master-0 kubenswrapper[29252]: W1203 20:19:21.411357 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1b07b73_62a5_42e9_9971_f7eca211f897.slice/crio-cb346e30ff1d84f2bb9241becb0c0c241ec980da5992c61f62e6ef9d618cff68 WatchSource:0}: Error finding container cb346e30ff1d84f2bb9241becb0c0c241ec980da5992c61f62e6ef9d618cff68: Status 404 returned error can't find the container with id cb346e30ff1d84f2bb9241becb0c0c241ec980da5992c61f62e6ef9d618cff68 Dec 03 20:19:21.536056 master-0 kubenswrapper[29252]: I1203 20:19:21.535996 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-58f4c9b998-p4zp5" event={"ID":"d1b07b73-62a5-42e9-9971-f7eca211f897","Type":"ContainerStarted","Data":"cb346e30ff1d84f2bb9241becb0c0c241ec980da5992c61f62e6ef9d618cff68"} Dec 03 20:19:22.133874 master-0 kubenswrapper[29252]: I1203 20:19:22.133817 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6c687"] Dec 03 20:19:22.139197 master-0 kubenswrapper[29252]: I1203 20:19:22.137759 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6c687" Dec 03 20:19:22.151722 master-0 kubenswrapper[29252]: I1203 20:19:22.151659 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6c687"] Dec 03 20:19:22.274210 master-0 kubenswrapper[29252]: I1203 20:19:22.274101 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2-utilities\") pod \"redhat-marketplace-6c687\" (UID: \"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2\") " pod="openshift-marketplace/redhat-marketplace-6c687" Dec 03 20:19:22.274446 master-0 kubenswrapper[29252]: I1203 20:19:22.274322 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2-catalog-content\") pod \"redhat-marketplace-6c687\" (UID: \"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2\") " pod="openshift-marketplace/redhat-marketplace-6c687" Dec 03 20:19:22.274446 master-0 kubenswrapper[29252]: I1203 20:19:22.274390 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wwkx\" (UniqueName: \"kubernetes.io/projected/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2-kube-api-access-8wwkx\") pod \"redhat-marketplace-6c687\" (UID: \"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2\") " pod="openshift-marketplace/redhat-marketplace-6c687" Dec 03 20:19:22.381943 master-0 kubenswrapper[29252]: I1203 20:19:22.381231 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2-catalog-content\") pod \"redhat-marketplace-6c687\" (UID: \"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2\") " pod="openshift-marketplace/redhat-marketplace-6c687" Dec 03 20:19:22.381943 master-0 
kubenswrapper[29252]: I1203 20:19:22.381955 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wwkx\" (UniqueName: \"kubernetes.io/projected/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2-kube-api-access-8wwkx\") pod \"redhat-marketplace-6c687\" (UID: \"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2\") " pod="openshift-marketplace/redhat-marketplace-6c687" Dec 03 20:19:22.382389 master-0 kubenswrapper[29252]: I1203 20:19:22.382324 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2-utilities\") pod \"redhat-marketplace-6c687\" (UID: \"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2\") " pod="openshift-marketplace/redhat-marketplace-6c687" Dec 03 20:19:22.382690 master-0 kubenswrapper[29252]: I1203 20:19:22.382644 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2-utilities\") pod \"redhat-marketplace-6c687\" (UID: \"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2\") " pod="openshift-marketplace/redhat-marketplace-6c687" Dec 03 20:19:22.382690 master-0 kubenswrapper[29252]: I1203 20:19:22.381878 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2-catalog-content\") pod \"redhat-marketplace-6c687\" (UID: \"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2\") " pod="openshift-marketplace/redhat-marketplace-6c687" Dec 03 20:19:22.403725 master-0 kubenswrapper[29252]: I1203 20:19:22.403611 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wwkx\" (UniqueName: \"kubernetes.io/projected/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2-kube-api-access-8wwkx\") pod \"redhat-marketplace-6c687\" (UID: \"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2\") " pod="openshift-marketplace/redhat-marketplace-6c687" Dec 03 
20:19:22.486118 master-0 kubenswrapper[29252]: I1203 20:19:22.486037 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6c687"
Dec 03 20:19:22.932498 master-0 kubenswrapper[29252]: I1203 20:19:22.931334 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6c687"]
Dec 03 20:19:22.944273 master-0 kubenswrapper[29252]: W1203 20:19:22.942937 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74bed1c0_b8ab_460a_ad2c_3eb5726bbbd2.slice/crio-55ff296616c46221be3071d2633c58f56e5fd9627efa8e739d9d1126363a5a4a WatchSource:0}: Error finding container 55ff296616c46221be3071d2633c58f56e5fd9627efa8e739d9d1126363a5a4a: Status 404 returned error can't find the container with id 55ff296616c46221be3071d2633c58f56e5fd9627efa8e739d9d1126363a5a4a
Dec 03 20:19:23.553230 master-0 kubenswrapper[29252]: I1203 20:19:23.553167 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6c687" event={"ID":"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2","Type":"ContainerStarted","Data":"55ff296616c46221be3071d2633c58f56e5fd9627efa8e739d9d1126363a5a4a"}
Dec 03 20:19:24.561471 master-0 kubenswrapper[29252]: I1203 20:19:24.561418 29252 generic.go:334] "Generic (PLEG): container finished" podID="74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2" containerID="1513a921962198c14d5ab9480cd3bf4a531e18225a2b615025e36eab3d5f2e64" exitCode=0
Dec 03 20:19:24.561471 master-0 kubenswrapper[29252]: I1203 20:19:24.561471 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6c687" event={"ID":"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2","Type":"ContainerDied","Data":"1513a921962198c14d5ab9480cd3bf4a531e18225a2b615025e36eab3d5f2e64"}
Dec 03 20:19:25.072647 master-0 kubenswrapper[29252]: I1203 20:19:25.072000 29252 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 03 20:19:26.089353 master-0 kubenswrapper[29252]: I1203 20:19:26.089302 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pvg4r"
Dec 03 20:19:26.089856 master-0 kubenswrapper[29252]: I1203 20:19:26.089367 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pvg4r"
Dec 03 20:19:26.124359 master-0 kubenswrapper[29252]: I1203 20:19:26.124312 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pvg4r"
Dec 03 20:19:26.641856 master-0 kubenswrapper[29252]: I1203 20:19:26.641754 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pvg4r"
Dec 03 20:19:27.332467 master-0 kubenswrapper[29252]: I1203 20:19:27.332410 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pvg4r"]
Dec 03 20:19:28.594501 master-0 kubenswrapper[29252]: I1203 20:19:28.594400 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-58f4c9b998-p4zp5" event={"ID":"d1b07b73-62a5-42e9-9971-f7eca211f897","Type":"ContainerStarted","Data":"feb3a9f319a96ad41f9e26bb66e4bde0203d0b1a60a68adba12d4e3722caaf06"}
Dec 03 20:19:28.599413 master-0 kubenswrapper[29252]: I1203 20:19:28.599322 29252 generic.go:334] "Generic (PLEG): container finished" podID="74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2" containerID="7fac9fad282983c9135db958a72c15814ff240e5c69e0b50dee5cbc1b569a1e7" exitCode=0
Dec 03 20:19:28.599598 master-0 kubenswrapper[29252]: I1203 20:19:28.599416 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6c687" event={"ID":"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2","Type":"ContainerDied","Data":"7fac9fad282983c9135db958a72c15814ff240e5c69e0b50dee5cbc1b569a1e7"}
Dec 03 20:19:28.599884 master-0 kubenswrapper[29252]: I1203 20:19:28.599823 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pvg4r" podUID="1845932f-38ee-42d4-b7ed-9598a45f5529" containerName="registry-server" containerID="cri-o://106890226fbe1bfa06ec76c8ebe0f58d940888185e1e5dc6f40744711af9d36c" gracePeriod=2
Dec 03 20:19:28.627943 master-0 kubenswrapper[29252]: I1203 20:19:28.627862 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-58f4c9b998-p4zp5" podStartSLOduration=2.107665314 podStartE2EDuration="8.627841435s" podCreationTimestamp="2025-12-03 20:19:20 +0000 UTC" firstStartedPulling="2025-12-03 20:19:21.414549379 +0000 UTC m=+596.228094332" lastFinishedPulling="2025-12-03 20:19:27.93472546 +0000 UTC m=+602.748270453" observedRunningTime="2025-12-03 20:19:28.620068804 +0000 UTC m=+603.433613797" watchObservedRunningTime="2025-12-03 20:19:28.627841435 +0000 UTC m=+603.441386408"
Dec 03 20:19:29.111221 master-0 kubenswrapper[29252]: I1203 20:19:29.111146 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pvg4r"
Dec 03 20:19:29.208007 master-0 kubenswrapper[29252]: I1203 20:19:29.207952 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1845932f-38ee-42d4-b7ed-9598a45f5529-catalog-content\") pod \"1845932f-38ee-42d4-b7ed-9598a45f5529\" (UID: \"1845932f-38ee-42d4-b7ed-9598a45f5529\") "
Dec 03 20:19:29.208245 master-0 kubenswrapper[29252]: I1203 20:19:29.208034 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk4qb\" (UniqueName: \"kubernetes.io/projected/1845932f-38ee-42d4-b7ed-9598a45f5529-kube-api-access-rk4qb\") pod \"1845932f-38ee-42d4-b7ed-9598a45f5529\" (UID: \"1845932f-38ee-42d4-b7ed-9598a45f5529\") "
Dec 03 20:19:29.208245 master-0 kubenswrapper[29252]: I1203 20:19:29.208098 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1845932f-38ee-42d4-b7ed-9598a45f5529-utilities\") pod \"1845932f-38ee-42d4-b7ed-9598a45f5529\" (UID: \"1845932f-38ee-42d4-b7ed-9598a45f5529\") "
Dec 03 20:19:29.209112 master-0 kubenswrapper[29252]: I1203 20:19:29.209072 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1845932f-38ee-42d4-b7ed-9598a45f5529-utilities" (OuterVolumeSpecName: "utilities") pod "1845932f-38ee-42d4-b7ed-9598a45f5529" (UID: "1845932f-38ee-42d4-b7ed-9598a45f5529"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 20:19:29.209545 master-0 kubenswrapper[29252]: I1203 20:19:29.209506 29252 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1845932f-38ee-42d4-b7ed-9598a45f5529-utilities\") on node \"master-0\" DevicePath \"\""
Dec 03 20:19:29.231629 master-0 kubenswrapper[29252]: I1203 20:19:29.231570 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1845932f-38ee-42d4-b7ed-9598a45f5529-kube-api-access-rk4qb" (OuterVolumeSpecName: "kube-api-access-rk4qb") pod "1845932f-38ee-42d4-b7ed-9598a45f5529" (UID: "1845932f-38ee-42d4-b7ed-9598a45f5529"). InnerVolumeSpecName "kube-api-access-rk4qb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 20:19:29.273010 master-0 kubenswrapper[29252]: I1203 20:19:29.272951 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1845932f-38ee-42d4-b7ed-9598a45f5529-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1845932f-38ee-42d4-b7ed-9598a45f5529" (UID: "1845932f-38ee-42d4-b7ed-9598a45f5529"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 20:19:29.311660 master-0 kubenswrapper[29252]: I1203 20:19:29.311579 29252 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1845932f-38ee-42d4-b7ed-9598a45f5529-catalog-content\") on node \"master-0\" DevicePath \"\""
Dec 03 20:19:29.311660 master-0 kubenswrapper[29252]: I1203 20:19:29.311646 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk4qb\" (UniqueName: \"kubernetes.io/projected/1845932f-38ee-42d4-b7ed-9598a45f5529-kube-api-access-rk4qb\") on node \"master-0\" DevicePath \"\""
Dec 03 20:19:29.609694 master-0 kubenswrapper[29252]: I1203 20:19:29.609648 29252 generic.go:334] "Generic (PLEG): container finished" podID="1845932f-38ee-42d4-b7ed-9598a45f5529" containerID="106890226fbe1bfa06ec76c8ebe0f58d940888185e1e5dc6f40744711af9d36c" exitCode=0
Dec 03 20:19:29.610338 master-0 kubenswrapper[29252]: I1203 20:19:29.609694 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvg4r" event={"ID":"1845932f-38ee-42d4-b7ed-9598a45f5529","Type":"ContainerDied","Data":"106890226fbe1bfa06ec76c8ebe0f58d940888185e1e5dc6f40744711af9d36c"}
Dec 03 20:19:29.610456 master-0 kubenswrapper[29252]: I1203 20:19:29.609767 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pvg4r"
Dec 03 20:19:29.610527 master-0 kubenswrapper[29252]: I1203 20:19:29.610513 29252 scope.go:117] "RemoveContainer" containerID="106890226fbe1bfa06ec76c8ebe0f58d940888185e1e5dc6f40744711af9d36c"
Dec 03 20:19:29.610663 master-0 kubenswrapper[29252]: I1203 20:19:29.610433 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvg4r" event={"ID":"1845932f-38ee-42d4-b7ed-9598a45f5529","Type":"ContainerDied","Data":"85705209a38b758e534f72062dbd6ed9d2f5d32061e5ec02ec314d2856c3525e"}
Dec 03 20:19:29.614736 master-0 kubenswrapper[29252]: I1203 20:19:29.614713 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6c687" event={"ID":"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2","Type":"ContainerStarted","Data":"3e4c5751eb35951e722f2f32236fb323f7ee49e48162c596e4b44c7a455264f8"}
Dec 03 20:19:29.634854 master-0 kubenswrapper[29252]: I1203 20:19:29.634567 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pvg4r"]
Dec 03 20:19:29.635110 master-0 kubenswrapper[29252]: I1203 20:19:29.635061 29252 scope.go:117] "RemoveContainer" containerID="47afacca7724c53c777fefb8b39533489fb513ae1aa95777d683017f587ed0d8"
Dec 03 20:19:29.640816 master-0 kubenswrapper[29252]: I1203 20:19:29.640735 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pvg4r"]
Dec 03 20:19:29.657946 master-0 kubenswrapper[29252]: I1203 20:19:29.657877 29252 scope.go:117] "RemoveContainer" containerID="aa17aa9b9b9e4a76185c997cad9fbadf5e8dcb7893ae87af3ab39d73fc5c6e19"
Dec 03 20:19:29.658863 master-0 kubenswrapper[29252]: I1203 20:19:29.658682 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6c687" podStartSLOduration=3.721120803 podStartE2EDuration="7.658657911s" podCreationTimestamp="2025-12-03 20:19:22 +0000 UTC" firstStartedPulling="2025-12-03 20:19:25.071901121 +0000 UTC m=+599.885446074" lastFinishedPulling="2025-12-03 20:19:29.009438219 +0000 UTC m=+603.822983182" observedRunningTime="2025-12-03 20:19:29.65409861 +0000 UTC m=+604.467643633" watchObservedRunningTime="2025-12-03 20:19:29.658657911 +0000 UTC m=+604.472202904"
Dec 03 20:19:29.685107 master-0 kubenswrapper[29252]: I1203 20:19:29.684958 29252 scope.go:117] "RemoveContainer" containerID="106890226fbe1bfa06ec76c8ebe0f58d940888185e1e5dc6f40744711af9d36c"
Dec 03 20:19:29.685760 master-0 kubenswrapper[29252]: E1203 20:19:29.685709 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"106890226fbe1bfa06ec76c8ebe0f58d940888185e1e5dc6f40744711af9d36c\": container with ID starting with 106890226fbe1bfa06ec76c8ebe0f58d940888185e1e5dc6f40744711af9d36c not found: ID does not exist" containerID="106890226fbe1bfa06ec76c8ebe0f58d940888185e1e5dc6f40744711af9d36c"
Dec 03 20:19:29.685892 master-0 kubenswrapper[29252]: I1203 20:19:29.685842 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"106890226fbe1bfa06ec76c8ebe0f58d940888185e1e5dc6f40744711af9d36c"} err="failed to get container status \"106890226fbe1bfa06ec76c8ebe0f58d940888185e1e5dc6f40744711af9d36c\": rpc error: code = NotFound desc = could not find container \"106890226fbe1bfa06ec76c8ebe0f58d940888185e1e5dc6f40744711af9d36c\": container with ID starting with 106890226fbe1bfa06ec76c8ebe0f58d940888185e1e5dc6f40744711af9d36c not found: ID does not exist"
Dec 03 20:19:29.685938 master-0 kubenswrapper[29252]: I1203 20:19:29.685901 29252 scope.go:117] "RemoveContainer" containerID="47afacca7724c53c777fefb8b39533489fb513ae1aa95777d683017f587ed0d8"
Dec 03 20:19:29.686355 master-0 kubenswrapper[29252]: E1203 20:19:29.686275 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47afacca7724c53c777fefb8b39533489fb513ae1aa95777d683017f587ed0d8\": container with ID starting with 47afacca7724c53c777fefb8b39533489fb513ae1aa95777d683017f587ed0d8 not found: ID does not exist" containerID="47afacca7724c53c777fefb8b39533489fb513ae1aa95777d683017f587ed0d8"
Dec 03 20:19:29.686355 master-0 kubenswrapper[29252]: I1203 20:19:29.686303 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47afacca7724c53c777fefb8b39533489fb513ae1aa95777d683017f587ed0d8"} err="failed to get container status \"47afacca7724c53c777fefb8b39533489fb513ae1aa95777d683017f587ed0d8\": rpc error: code = NotFound desc = could not find container \"47afacca7724c53c777fefb8b39533489fb513ae1aa95777d683017f587ed0d8\": container with ID starting with 47afacca7724c53c777fefb8b39533489fb513ae1aa95777d683017f587ed0d8 not found: ID does not exist"
Dec 03 20:19:29.686355 master-0 kubenswrapper[29252]: I1203 20:19:29.686319 29252 scope.go:117] "RemoveContainer" containerID="aa17aa9b9b9e4a76185c997cad9fbadf5e8dcb7893ae87af3ab39d73fc5c6e19"
Dec 03 20:19:29.686881 master-0 kubenswrapper[29252]: E1203 20:19:29.686840 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa17aa9b9b9e4a76185c997cad9fbadf5e8dcb7893ae87af3ab39d73fc5c6e19\": container with ID starting with aa17aa9b9b9e4a76185c997cad9fbadf5e8dcb7893ae87af3ab39d73fc5c6e19 not found: ID does not exist" containerID="aa17aa9b9b9e4a76185c997cad9fbadf5e8dcb7893ae87af3ab39d73fc5c6e19"
Dec 03 20:19:29.686959 master-0 kubenswrapper[29252]: I1203 20:19:29.686888 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa17aa9b9b9e4a76185c997cad9fbadf5e8dcb7893ae87af3ab39d73fc5c6e19"} err="failed to get container status \"aa17aa9b9b9e4a76185c997cad9fbadf5e8dcb7893ae87af3ab39d73fc5c6e19\": rpc error: code = NotFound desc = could not find container \"aa17aa9b9b9e4a76185c997cad9fbadf5e8dcb7893ae87af3ab39d73fc5c6e19\": container with ID starting with aa17aa9b9b9e4a76185c997cad9fbadf5e8dcb7893ae87af3ab39d73fc5c6e19 not found: ID does not exist"
Dec 03 20:19:30.978983 master-0 kubenswrapper[29252]: I1203 20:19:30.978903 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-58f4c9b998-p4zp5"
Dec 03 20:19:30.978983 master-0 kubenswrapper[29252]: I1203 20:19:30.978998 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-58f4c9b998-p4zp5"
Dec 03 20:19:30.994239 master-0 kubenswrapper[29252]: I1203 20:19:30.994188 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-58f4c9b998-p4zp5"
Dec 03 20:19:31.431095 master-0 kubenswrapper[29252]: I1203 20:19:31.431009 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1845932f-38ee-42d4-b7ed-9598a45f5529" path="/var/lib/kubelet/pods/1845932f-38ee-42d4-b7ed-9598a45f5529/volumes"
Dec 03 20:19:31.638002 master-0 kubenswrapper[29252]: I1203 20:19:31.637918 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-58f4c9b998-p4zp5"
Dec 03 20:19:32.487193 master-0 kubenswrapper[29252]: I1203 20:19:32.487127 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6c687"
Dec 03 20:19:32.487193 master-0 kubenswrapper[29252]: I1203 20:19:32.487192 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6c687"
Dec 03 20:19:32.530854 master-0 kubenswrapper[29252]: I1203 20:19:32.530802 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6c687"
Dec 03 20:19:34.731758 master-0 kubenswrapper[29252]: I1203 20:19:34.731688 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fl8pk"]
Dec 03 20:19:34.732935 master-0 kubenswrapper[29252]: E1203 20:19:34.731994 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1845932f-38ee-42d4-b7ed-9598a45f5529" containerName="registry-server"
Dec 03 20:19:34.732935 master-0 kubenswrapper[29252]: I1203 20:19:34.732012 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="1845932f-38ee-42d4-b7ed-9598a45f5529" containerName="registry-server"
Dec 03 20:19:34.732935 master-0 kubenswrapper[29252]: E1203 20:19:34.732048 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1845932f-38ee-42d4-b7ed-9598a45f5529" containerName="extract-utilities"
Dec 03 20:19:34.732935 master-0 kubenswrapper[29252]: I1203 20:19:34.732062 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="1845932f-38ee-42d4-b7ed-9598a45f5529" containerName="extract-utilities"
Dec 03 20:19:34.732935 master-0 kubenswrapper[29252]: E1203 20:19:34.732112 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1845932f-38ee-42d4-b7ed-9598a45f5529" containerName="extract-content"
Dec 03 20:19:34.732935 master-0 kubenswrapper[29252]: I1203 20:19:34.732124 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="1845932f-38ee-42d4-b7ed-9598a45f5529" containerName="extract-content"
Dec 03 20:19:34.732935 master-0 kubenswrapper[29252]: I1203 20:19:34.732336 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="1845932f-38ee-42d4-b7ed-9598a45f5529" containerName="registry-server"
Dec 03 20:19:34.733562 master-0 kubenswrapper[29252]: I1203 20:19:34.733522 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fl8pk"
Dec 03 20:19:34.773894 master-0 kubenswrapper[29252]: I1203 20:19:34.742111 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fl8pk"]
Dec 03 20:19:34.795746 master-0 kubenswrapper[29252]: I1203 20:19:34.795699 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db9b87d-5b22-4e8d-8d28-810fc331512f-catalog-content\") pod \"certified-operators-fl8pk\" (UID: \"8db9b87d-5b22-4e8d-8d28-810fc331512f\") " pod="openshift-marketplace/certified-operators-fl8pk"
Dec 03 20:19:34.795920 master-0 kubenswrapper[29252]: I1203 20:19:34.795783 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpk98\" (UniqueName: \"kubernetes.io/projected/8db9b87d-5b22-4e8d-8d28-810fc331512f-kube-api-access-bpk98\") pod \"certified-operators-fl8pk\" (UID: \"8db9b87d-5b22-4e8d-8d28-810fc331512f\") " pod="openshift-marketplace/certified-operators-fl8pk"
Dec 03 20:19:34.795920 master-0 kubenswrapper[29252]: I1203 20:19:34.795828 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db9b87d-5b22-4e8d-8d28-810fc331512f-utilities\") pod \"certified-operators-fl8pk\" (UID: \"8db9b87d-5b22-4e8d-8d28-810fc331512f\") " pod="openshift-marketplace/certified-operators-fl8pk"
Dec 03 20:19:34.897414 master-0 kubenswrapper[29252]: I1203 20:19:34.897354 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db9b87d-5b22-4e8d-8d28-810fc331512f-utilities\") pod \"certified-operators-fl8pk\" (UID: \"8db9b87d-5b22-4e8d-8d28-810fc331512f\") " pod="openshift-marketplace/certified-operators-fl8pk"
Dec 03 20:19:34.897635 master-0 kubenswrapper[29252]: I1203 20:19:34.897484 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db9b87d-5b22-4e8d-8d28-810fc331512f-catalog-content\") pod \"certified-operators-fl8pk\" (UID: \"8db9b87d-5b22-4e8d-8d28-810fc331512f\") " pod="openshift-marketplace/certified-operators-fl8pk"
Dec 03 20:19:34.897635 master-0 kubenswrapper[29252]: I1203 20:19:34.897543 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpk98\" (UniqueName: \"kubernetes.io/projected/8db9b87d-5b22-4e8d-8d28-810fc331512f-kube-api-access-bpk98\") pod \"certified-operators-fl8pk\" (UID: \"8db9b87d-5b22-4e8d-8d28-810fc331512f\") " pod="openshift-marketplace/certified-operators-fl8pk"
Dec 03 20:19:34.898035 master-0 kubenswrapper[29252]: I1203 20:19:34.897990 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db9b87d-5b22-4e8d-8d28-810fc331512f-utilities\") pod \"certified-operators-fl8pk\" (UID: \"8db9b87d-5b22-4e8d-8d28-810fc331512f\") " pod="openshift-marketplace/certified-operators-fl8pk"
Dec 03 20:19:34.898119 master-0 kubenswrapper[29252]: I1203 20:19:34.898038 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db9b87d-5b22-4e8d-8d28-810fc331512f-catalog-content\") pod \"certified-operators-fl8pk\" (UID: \"8db9b87d-5b22-4e8d-8d28-810fc331512f\") " pod="openshift-marketplace/certified-operators-fl8pk"
Dec 03 20:19:34.915755 master-0 kubenswrapper[29252]: I1203 20:19:34.915711 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpk98\" (UniqueName: \"kubernetes.io/projected/8db9b87d-5b22-4e8d-8d28-810fc331512f-kube-api-access-bpk98\") pod \"certified-operators-fl8pk\" (UID: \"8db9b87d-5b22-4e8d-8d28-810fc331512f\") " pod="openshift-marketplace/certified-operators-fl8pk"
Dec 03 20:19:35.098286 master-0 kubenswrapper[29252]: I1203 20:19:35.098152 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fl8pk"
Dec 03 20:19:35.520846 master-0 kubenswrapper[29252]: I1203 20:19:35.518736 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fl8pk"]
Dec 03 20:19:35.666365 master-0 kubenswrapper[29252]: I1203 20:19:35.666309 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fl8pk" event={"ID":"8db9b87d-5b22-4e8d-8d28-810fc331512f","Type":"ContainerStarted","Data":"a497936c872309d56e83458e014d0518267ab517e47fe4c6f6acef8c3bc34818"}
Dec 03 20:19:35.666365 master-0 kubenswrapper[29252]: I1203 20:19:35.666359 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fl8pk" event={"ID":"8db9b87d-5b22-4e8d-8d28-810fc331512f","Type":"ContainerStarted","Data":"7a6cef2ce4d8ac4cabf8ced4971aab916444d9b11c709e7c25d519a572e10391"}
Dec 03 20:19:36.677476 master-0 kubenswrapper[29252]: I1203 20:19:36.677388 29252 generic.go:334] "Generic (PLEG): container finished" podID="8db9b87d-5b22-4e8d-8d28-810fc331512f" containerID="a497936c872309d56e83458e014d0518267ab517e47fe4c6f6acef8c3bc34818" exitCode=0
Dec 03 20:19:36.677476 master-0 kubenswrapper[29252]: I1203 20:19:36.677439 29252 generic.go:334] "Generic (PLEG): container finished" podID="8db9b87d-5b22-4e8d-8d28-810fc331512f" containerID="aac34a06927b521d9a14fe00732210266935ff302ebe1aeaed28d935944c52d7" exitCode=0
Dec 03 20:19:36.677476 master-0 kubenswrapper[29252]: I1203 20:19:36.677468 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fl8pk" event={"ID":"8db9b87d-5b22-4e8d-8d28-810fc331512f","Type":"ContainerDied","Data":"a497936c872309d56e83458e014d0518267ab517e47fe4c6f6acef8c3bc34818"}
Dec 03 20:19:36.678091 master-0 kubenswrapper[29252]: I1203 20:19:36.677502 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fl8pk" event={"ID":"8db9b87d-5b22-4e8d-8d28-810fc331512f","Type":"ContainerDied","Data":"aac34a06927b521d9a14fe00732210266935ff302ebe1aeaed28d935944c52d7"}
Dec 03 20:19:37.686053 master-0 kubenswrapper[29252]: I1203 20:19:37.686008 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fl8pk" event={"ID":"8db9b87d-5b22-4e8d-8d28-810fc331512f","Type":"ContainerStarted","Data":"6bb8902076ffec1dd48d3b6ec20b934db1351df9292fc82d9c1d55716d65d31c"}
Dec 03 20:19:37.713272 master-0 kubenswrapper[29252]: I1203 20:19:37.713180 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fl8pk" podStartSLOduration=2.316404289 podStartE2EDuration="3.713162537s" podCreationTimestamp="2025-12-03 20:19:34 +0000 UTC" firstStartedPulling="2025-12-03 20:19:35.667829406 +0000 UTC m=+610.481374359" lastFinishedPulling="2025-12-03 20:19:37.064587654 +0000 UTC m=+611.878132607" observedRunningTime="2025-12-03 20:19:37.710982883 +0000 UTC m=+612.524527856" watchObservedRunningTime="2025-12-03 20:19:37.713162537 +0000 UTC m=+612.526707480"
Dec 03 20:19:42.528149 master-0 kubenswrapper[29252]: I1203 20:19:42.528087 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6c687"
Dec 03 20:19:42.836940 master-0 kubenswrapper[29252]: I1203 20:19:42.836654 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6c687"]
Dec 03 20:19:42.837214 master-0 kubenswrapper[29252]: I1203 20:19:42.837036 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6c687" podUID="74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2" containerName="registry-server" containerID="cri-o://3e4c5751eb35951e722f2f32236fb323f7ee49e48162c596e4b44c7a455264f8" gracePeriod=2
Dec 03 20:19:43.259134 master-0 kubenswrapper[29252]: I1203 20:19:43.259066 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6c687"
Dec 03 20:19:43.335270 master-0 kubenswrapper[29252]: I1203 20:19:43.335202 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2-utilities\") pod \"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2\" (UID: \"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2\") "
Dec 03 20:19:43.335481 master-0 kubenswrapper[29252]: I1203 20:19:43.335371 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wwkx\" (UniqueName: \"kubernetes.io/projected/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2-kube-api-access-8wwkx\") pod \"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2\" (UID: \"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2\") "
Dec 03 20:19:43.335481 master-0 kubenswrapper[29252]: I1203 20:19:43.335461 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2-catalog-content\") pod \"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2\" (UID: \"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2\") "
Dec 03 20:19:43.336833 master-0 kubenswrapper[29252]: I1203 20:19:43.336354 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2-utilities" (OuterVolumeSpecName: "utilities") pod "74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2" (UID: "74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 20:19:43.340934 master-0 kubenswrapper[29252]: I1203 20:19:43.339035 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2-kube-api-access-8wwkx" (OuterVolumeSpecName: "kube-api-access-8wwkx") pod "74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2" (UID: "74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2"). InnerVolumeSpecName "kube-api-access-8wwkx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 20:19:43.364392 master-0 kubenswrapper[29252]: I1203 20:19:43.364332 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2" (UID: "74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 20:19:43.438176 master-0 kubenswrapper[29252]: I1203 20:19:43.438099 29252 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2-catalog-content\") on node \"master-0\" DevicePath \"\""
Dec 03 20:19:43.438176 master-0 kubenswrapper[29252]: I1203 20:19:43.438165 29252 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2-utilities\") on node \"master-0\" DevicePath \"\""
Dec 03 20:19:43.438460 master-0 kubenswrapper[29252]: I1203 20:19:43.438186 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wwkx\" (UniqueName: \"kubernetes.io/projected/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2-kube-api-access-8wwkx\") on node \"master-0\" DevicePath \"\""
Dec 03 20:19:43.753951 master-0 kubenswrapper[29252]: I1203 20:19:43.753862 29252 generic.go:334] "Generic (PLEG): container finished" podID="74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2" containerID="3e4c5751eb35951e722f2f32236fb323f7ee49e48162c596e4b44c7a455264f8" exitCode=0
Dec 03 20:19:43.754938 master-0 kubenswrapper[29252]: I1203 20:19:43.753941 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6c687" event={"ID":"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2","Type":"ContainerDied","Data":"3e4c5751eb35951e722f2f32236fb323f7ee49e48162c596e4b44c7a455264f8"}
Dec 03 20:19:43.754938 master-0 kubenswrapper[29252]: I1203 20:19:43.754000 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6c687"
Dec 03 20:19:43.754938 master-0 kubenswrapper[29252]: I1203 20:19:43.754034 29252 scope.go:117] "RemoveContainer" containerID="3e4c5751eb35951e722f2f32236fb323f7ee49e48162c596e4b44c7a455264f8"
Dec 03 20:19:43.754938 master-0 kubenswrapper[29252]: I1203 20:19:43.754013 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6c687" event={"ID":"74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2","Type":"ContainerDied","Data":"55ff296616c46221be3071d2633c58f56e5fd9627efa8e739d9d1126363a5a4a"}
Dec 03 20:19:43.792799 master-0 kubenswrapper[29252]: I1203 20:19:43.792706 29252 scope.go:117] "RemoveContainer" containerID="7fac9fad282983c9135db958a72c15814ff240e5c69e0b50dee5cbc1b569a1e7"
Dec 03 20:19:43.798101 master-0 kubenswrapper[29252]: I1203 20:19:43.798022 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6c687"]
Dec 03 20:19:43.812663 master-0 kubenswrapper[29252]: I1203 20:19:43.812569 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6c687"]
Dec 03 20:19:43.828612 master-0 kubenswrapper[29252]: I1203 20:19:43.828533 29252 scope.go:117] "RemoveContainer" containerID="1513a921962198c14d5ab9480cd3bf4a531e18225a2b615025e36eab3d5f2e64"
Dec 03 20:19:43.872448 master-0 kubenswrapper[29252]: I1203 20:19:43.871752 29252 scope.go:117] "RemoveContainer" containerID="3e4c5751eb35951e722f2f32236fb323f7ee49e48162c596e4b44c7a455264f8"
Dec 03 20:19:43.872621 master-0 kubenswrapper[29252]: E1203 20:19:43.872526 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e4c5751eb35951e722f2f32236fb323f7ee49e48162c596e4b44c7a455264f8\": container with ID starting with 3e4c5751eb35951e722f2f32236fb323f7ee49e48162c596e4b44c7a455264f8 not found: ID does not exist" containerID="3e4c5751eb35951e722f2f32236fb323f7ee49e48162c596e4b44c7a455264f8"
Dec 03 20:19:43.872621 master-0 kubenswrapper[29252]: I1203 20:19:43.872587 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e4c5751eb35951e722f2f32236fb323f7ee49e48162c596e4b44c7a455264f8"} err="failed to get container status \"3e4c5751eb35951e722f2f32236fb323f7ee49e48162c596e4b44c7a455264f8\": rpc error: code = NotFound desc = could not find container \"3e4c5751eb35951e722f2f32236fb323f7ee49e48162c596e4b44c7a455264f8\": container with ID starting with 3e4c5751eb35951e722f2f32236fb323f7ee49e48162c596e4b44c7a455264f8 not found: ID does not exist"
Dec 03 20:19:43.872621 master-0 kubenswrapper[29252]: I1203 20:19:43.872625 29252 scope.go:117] "RemoveContainer" containerID="7fac9fad282983c9135db958a72c15814ff240e5c69e0b50dee5cbc1b569a1e7"
Dec 03 20:19:43.873210 master-0 kubenswrapper[29252]: E1203 20:19:43.873126 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fac9fad282983c9135db958a72c15814ff240e5c69e0b50dee5cbc1b569a1e7\": container with ID starting with 7fac9fad282983c9135db958a72c15814ff240e5c69e0b50dee5cbc1b569a1e7 not found: ID does not exist" containerID="7fac9fad282983c9135db958a72c15814ff240e5c69e0b50dee5cbc1b569a1e7"
Dec 03 20:19:43.873210 master-0 kubenswrapper[29252]: I1203 20:19:43.873189 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fac9fad282983c9135db958a72c15814ff240e5c69e0b50dee5cbc1b569a1e7"} err="failed to get container status \"7fac9fad282983c9135db958a72c15814ff240e5c69e0b50dee5cbc1b569a1e7\": rpc error: code = NotFound desc = could not find container \"7fac9fad282983c9135db958a72c15814ff240e5c69e0b50dee5cbc1b569a1e7\": container with ID starting with 7fac9fad282983c9135db958a72c15814ff240e5c69e0b50dee5cbc1b569a1e7 not found: ID does not exist"
Dec 03 20:19:43.873493 master-0 kubenswrapper[29252]: I1203 20:19:43.873233 29252 scope.go:117] "RemoveContainer" containerID="1513a921962198c14d5ab9480cd3bf4a531e18225a2b615025e36eab3d5f2e64"
Dec 03 20:19:43.873644 master-0 kubenswrapper[29252]: E1203 20:19:43.873606 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1513a921962198c14d5ab9480cd3bf4a531e18225a2b615025e36eab3d5f2e64\": container with ID starting with 1513a921962198c14d5ab9480cd3bf4a531e18225a2b615025e36eab3d5f2e64 not found: ID does not exist" containerID="1513a921962198c14d5ab9480cd3bf4a531e18225a2b615025e36eab3d5f2e64"
Dec 03 20:19:43.873730 master-0 kubenswrapper[29252]: I1203 20:19:43.873640 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1513a921962198c14d5ab9480cd3bf4a531e18225a2b615025e36eab3d5f2e64"} err="failed to get container status \"1513a921962198c14d5ab9480cd3bf4a531e18225a2b615025e36eab3d5f2e64\": rpc error: code = NotFound desc = could not find container \"1513a921962198c14d5ab9480cd3bf4a531e18225a2b615025e36eab3d5f2e64\": container with ID starting with 1513a921962198c14d5ab9480cd3bf4a531e18225a2b615025e36eab3d5f2e64 not found: ID does not exist"
Dec 03 20:19:45.098862 master-0 kubenswrapper[29252]: I1203 20:19:45.098752 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fl8pk"
Dec 03 20:19:45.098862 master-0 kubenswrapper[29252]: I1203 20:19:45.098844 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fl8pk"
Dec 03 20:19:45.139347 master-0 kubenswrapper[29252]: I1203 20:19:45.139280 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fl8pk"
Dec 03 20:19:45.424382 master-0 kubenswrapper[29252]: I1203 20:19:45.424299 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2" path="/var/lib/kubelet/pods/74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2/volumes"
Dec 03 20:19:45.846032 master-0 kubenswrapper[29252]: I1203 20:19:45.845843 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fl8pk"
Dec 03 20:19:47.335212 master-0 kubenswrapper[29252]: I1203 20:19:47.335142 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fl8pk"]
Dec 03 20:19:47.794365 master-0 kubenswrapper[29252]: I1203 20:19:47.794247 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fl8pk" podUID="8db9b87d-5b22-4e8d-8d28-810fc331512f" containerName="registry-server" containerID="cri-o://6bb8902076ffec1dd48d3b6ec20b934db1351df9292fc82d9c1d55716d65d31c" gracePeriod=2
Dec 03 20:19:48.311338 master-0 kubenswrapper[29252]: I1203 20:19:48.311253 29252 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-fl8pk" Dec 03 20:19:48.430536 master-0 kubenswrapper[29252]: I1203 20:19:48.430446 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db9b87d-5b22-4e8d-8d28-810fc331512f-catalog-content\") pod \"8db9b87d-5b22-4e8d-8d28-810fc331512f\" (UID: \"8db9b87d-5b22-4e8d-8d28-810fc331512f\") " Dec 03 20:19:48.431508 master-0 kubenswrapper[29252]: I1203 20:19:48.430755 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db9b87d-5b22-4e8d-8d28-810fc331512f-utilities\") pod \"8db9b87d-5b22-4e8d-8d28-810fc331512f\" (UID: \"8db9b87d-5b22-4e8d-8d28-810fc331512f\") " Dec 03 20:19:48.431508 master-0 kubenswrapper[29252]: I1203 20:19:48.430825 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpk98\" (UniqueName: \"kubernetes.io/projected/8db9b87d-5b22-4e8d-8d28-810fc331512f-kube-api-access-bpk98\") pod \"8db9b87d-5b22-4e8d-8d28-810fc331512f\" (UID: \"8db9b87d-5b22-4e8d-8d28-810fc331512f\") " Dec 03 20:19:48.431936 master-0 kubenswrapper[29252]: I1203 20:19:48.431866 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db9b87d-5b22-4e8d-8d28-810fc331512f-utilities" (OuterVolumeSpecName: "utilities") pod "8db9b87d-5b22-4e8d-8d28-810fc331512f" (UID: "8db9b87d-5b22-4e8d-8d28-810fc331512f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:19:48.435333 master-0 kubenswrapper[29252]: I1203 20:19:48.435266 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db9b87d-5b22-4e8d-8d28-810fc331512f-kube-api-access-bpk98" (OuterVolumeSpecName: "kube-api-access-bpk98") pod "8db9b87d-5b22-4e8d-8d28-810fc331512f" (UID: "8db9b87d-5b22-4e8d-8d28-810fc331512f"). InnerVolumeSpecName "kube-api-access-bpk98". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:19:48.476837 master-0 kubenswrapper[29252]: I1203 20:19:48.476726 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db9b87d-5b22-4e8d-8d28-810fc331512f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8db9b87d-5b22-4e8d-8d28-810fc331512f" (UID: "8db9b87d-5b22-4e8d-8d28-810fc331512f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:19:48.533222 master-0 kubenswrapper[29252]: I1203 20:19:48.533037 29252 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db9b87d-5b22-4e8d-8d28-810fc331512f-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 20:19:48.533222 master-0 kubenswrapper[29252]: I1203 20:19:48.533119 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpk98\" (UniqueName: \"kubernetes.io/projected/8db9b87d-5b22-4e8d-8d28-810fc331512f-kube-api-access-bpk98\") on node \"master-0\" DevicePath \"\"" Dec 03 20:19:48.533222 master-0 kubenswrapper[29252]: I1203 20:19:48.533137 29252 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db9b87d-5b22-4e8d-8d28-810fc331512f-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 20:19:48.815905 master-0 kubenswrapper[29252]: I1203 20:19:48.815742 29252 generic.go:334] "Generic (PLEG): container finished" 
podID="8db9b87d-5b22-4e8d-8d28-810fc331512f" containerID="6bb8902076ffec1dd48d3b6ec20b934db1351df9292fc82d9c1d55716d65d31c" exitCode=0 Dec 03 20:19:48.815905 master-0 kubenswrapper[29252]: I1203 20:19:48.815817 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fl8pk" event={"ID":"8db9b87d-5b22-4e8d-8d28-810fc331512f","Type":"ContainerDied","Data":"6bb8902076ffec1dd48d3b6ec20b934db1351df9292fc82d9c1d55716d65d31c"} Dec 03 20:19:48.815905 master-0 kubenswrapper[29252]: I1203 20:19:48.815849 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fl8pk" event={"ID":"8db9b87d-5b22-4e8d-8d28-810fc331512f","Type":"ContainerDied","Data":"7a6cef2ce4d8ac4cabf8ced4971aab916444d9b11c709e7c25d519a572e10391"} Dec 03 20:19:48.815905 master-0 kubenswrapper[29252]: I1203 20:19:48.815870 29252 scope.go:117] "RemoveContainer" containerID="6bb8902076ffec1dd48d3b6ec20b934db1351df9292fc82d9c1d55716d65d31c" Dec 03 20:19:48.815905 master-0 kubenswrapper[29252]: I1203 20:19:48.815925 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fl8pk" Dec 03 20:19:48.848634 master-0 kubenswrapper[29252]: I1203 20:19:48.848600 29252 scope.go:117] "RemoveContainer" containerID="aac34a06927b521d9a14fe00732210266935ff302ebe1aeaed28d935944c52d7" Dec 03 20:19:48.869693 master-0 kubenswrapper[29252]: I1203 20:19:48.869596 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fl8pk"] Dec 03 20:19:48.878014 master-0 kubenswrapper[29252]: I1203 20:19:48.877929 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fl8pk"] Dec 03 20:19:48.889483 master-0 kubenswrapper[29252]: I1203 20:19:48.889423 29252 scope.go:117] "RemoveContainer" containerID="a497936c872309d56e83458e014d0518267ab517e47fe4c6f6acef8c3bc34818" Dec 03 20:19:48.909561 master-0 kubenswrapper[29252]: I1203 20:19:48.909508 29252 scope.go:117] "RemoveContainer" containerID="6bb8902076ffec1dd48d3b6ec20b934db1351df9292fc82d9c1d55716d65d31c" Dec 03 20:19:48.910016 master-0 kubenswrapper[29252]: E1203 20:19:48.909973 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bb8902076ffec1dd48d3b6ec20b934db1351df9292fc82d9c1d55716d65d31c\": container with ID starting with 6bb8902076ffec1dd48d3b6ec20b934db1351df9292fc82d9c1d55716d65d31c not found: ID does not exist" containerID="6bb8902076ffec1dd48d3b6ec20b934db1351df9292fc82d9c1d55716d65d31c" Dec 03 20:19:48.910136 master-0 kubenswrapper[29252]: I1203 20:19:48.910015 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bb8902076ffec1dd48d3b6ec20b934db1351df9292fc82d9c1d55716d65d31c"} err="failed to get container status \"6bb8902076ffec1dd48d3b6ec20b934db1351df9292fc82d9c1d55716d65d31c\": rpc error: code = NotFound desc = could not find container \"6bb8902076ffec1dd48d3b6ec20b934db1351df9292fc82d9c1d55716d65d31c\": 
container with ID starting with 6bb8902076ffec1dd48d3b6ec20b934db1351df9292fc82d9c1d55716d65d31c not found: ID does not exist" Dec 03 20:19:48.910136 master-0 kubenswrapper[29252]: I1203 20:19:48.910064 29252 scope.go:117] "RemoveContainer" containerID="aac34a06927b521d9a14fe00732210266935ff302ebe1aeaed28d935944c52d7" Dec 03 20:19:48.910382 master-0 kubenswrapper[29252]: E1203 20:19:48.910358 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aac34a06927b521d9a14fe00732210266935ff302ebe1aeaed28d935944c52d7\": container with ID starting with aac34a06927b521d9a14fe00732210266935ff302ebe1aeaed28d935944c52d7 not found: ID does not exist" containerID="aac34a06927b521d9a14fe00732210266935ff302ebe1aeaed28d935944c52d7" Dec 03 20:19:48.910382 master-0 kubenswrapper[29252]: I1203 20:19:48.910378 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aac34a06927b521d9a14fe00732210266935ff302ebe1aeaed28d935944c52d7"} err="failed to get container status \"aac34a06927b521d9a14fe00732210266935ff302ebe1aeaed28d935944c52d7\": rpc error: code = NotFound desc = could not find container \"aac34a06927b521d9a14fe00732210266935ff302ebe1aeaed28d935944c52d7\": container with ID starting with aac34a06927b521d9a14fe00732210266935ff302ebe1aeaed28d935944c52d7 not found: ID does not exist" Dec 03 20:19:48.910536 master-0 kubenswrapper[29252]: I1203 20:19:48.910389 29252 scope.go:117] "RemoveContainer" containerID="a497936c872309d56e83458e014d0518267ab517e47fe4c6f6acef8c3bc34818" Dec 03 20:19:48.910943 master-0 kubenswrapper[29252]: E1203 20:19:48.910897 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a497936c872309d56e83458e014d0518267ab517e47fe4c6f6acef8c3bc34818\": container with ID starting with a497936c872309d56e83458e014d0518267ab517e47fe4c6f6acef8c3bc34818 not found: ID does not exist" 
containerID="a497936c872309d56e83458e014d0518267ab517e47fe4c6f6acef8c3bc34818" Dec 03 20:19:48.911066 master-0 kubenswrapper[29252]: I1203 20:19:48.910950 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a497936c872309d56e83458e014d0518267ab517e47fe4c6f6acef8c3bc34818"} err="failed to get container status \"a497936c872309d56e83458e014d0518267ab517e47fe4c6f6acef8c3bc34818\": rpc error: code = NotFound desc = could not find container \"a497936c872309d56e83458e014d0518267ab517e47fe4c6f6acef8c3bc34818\": container with ID starting with a497936c872309d56e83458e014d0518267ab517e47fe4c6f6acef8c3bc34818 not found: ID does not exist" Dec 03 20:19:49.426029 master-0 kubenswrapper[29252]: I1203 20:19:49.425964 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db9b87d-5b22-4e8d-8d28-810fc331512f" path="/var/lib/kubelet/pods/8db9b87d-5b22-4e8d-8d28-810fc331512f/volumes" Dec 03 20:20:00.912383 master-0 kubenswrapper[29252]: I1203 20:20:00.912229 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4czb7"] Dec 03 20:20:00.913115 master-0 kubenswrapper[29252]: E1203 20:20:00.912757 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db9b87d-5b22-4e8d-8d28-810fc331512f" containerName="extract-content" Dec 03 20:20:00.913115 master-0 kubenswrapper[29252]: I1203 20:20:00.912816 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db9b87d-5b22-4e8d-8d28-810fc331512f" containerName="extract-content" Dec 03 20:20:00.913115 master-0 kubenswrapper[29252]: E1203 20:20:00.912840 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db9b87d-5b22-4e8d-8d28-810fc331512f" containerName="extract-utilities" Dec 03 20:20:00.913115 master-0 kubenswrapper[29252]: I1203 20:20:00.912853 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db9b87d-5b22-4e8d-8d28-810fc331512f" containerName="extract-utilities" Dec 03 
20:20:00.913115 master-0 kubenswrapper[29252]: E1203 20:20:00.912884 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db9b87d-5b22-4e8d-8d28-810fc331512f" containerName="registry-server" Dec 03 20:20:00.913115 master-0 kubenswrapper[29252]: I1203 20:20:00.912897 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db9b87d-5b22-4e8d-8d28-810fc331512f" containerName="registry-server" Dec 03 20:20:00.913115 master-0 kubenswrapper[29252]: E1203 20:20:00.912910 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2" containerName="extract-content" Dec 03 20:20:00.913115 master-0 kubenswrapper[29252]: I1203 20:20:00.912922 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2" containerName="extract-content" Dec 03 20:20:00.913115 master-0 kubenswrapper[29252]: E1203 20:20:00.912964 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2" containerName="extract-utilities" Dec 03 20:20:00.913115 master-0 kubenswrapper[29252]: I1203 20:20:00.912976 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2" containerName="extract-utilities" Dec 03 20:20:00.913115 master-0 kubenswrapper[29252]: E1203 20:20:00.912997 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2" containerName="registry-server" Dec 03 20:20:00.913115 master-0 kubenswrapper[29252]: I1203 20:20:00.913009 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2" containerName="registry-server" Dec 03 20:20:00.913508 master-0 kubenswrapper[29252]: I1203 20:20:00.913239 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db9b87d-5b22-4e8d-8d28-810fc331512f" containerName="registry-server" Dec 03 20:20:00.913508 master-0 kubenswrapper[29252]: I1203 20:20:00.913302 29252 
memory_manager.go:354] "RemoveStaleState removing state" podUID="74bed1c0-b8ab-460a-ad2c-3eb5726bbbd2" containerName="registry-server" Dec 03 20:20:00.922835 master-0 kubenswrapper[29252]: I1203 20:20:00.922769 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4czb7"] Dec 03 20:20:00.922835 master-0 kubenswrapper[29252]: I1203 20:20:00.922808 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4czb7" Dec 03 20:20:01.068457 master-0 kubenswrapper[29252]: I1203 20:20:01.068371 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m9rc\" (UniqueName: \"kubernetes.io/projected/b1c881aa-d427-4a34-be1f-fab641887b67-kube-api-access-7m9rc\") pod \"redhat-operators-4czb7\" (UID: \"b1c881aa-d427-4a34-be1f-fab641887b67\") " pod="openshift-marketplace/redhat-operators-4czb7" Dec 03 20:20:01.068763 master-0 kubenswrapper[29252]: I1203 20:20:01.068682 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c881aa-d427-4a34-be1f-fab641887b67-catalog-content\") pod \"redhat-operators-4czb7\" (UID: \"b1c881aa-d427-4a34-be1f-fab641887b67\") " pod="openshift-marketplace/redhat-operators-4czb7" Dec 03 20:20:01.068962 master-0 kubenswrapper[29252]: I1203 20:20:01.068930 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c881aa-d427-4a34-be1f-fab641887b67-utilities\") pod \"redhat-operators-4czb7\" (UID: \"b1c881aa-d427-4a34-be1f-fab641887b67\") " pod="openshift-marketplace/redhat-operators-4czb7" Dec 03 20:20:01.170750 master-0 kubenswrapper[29252]: I1203 20:20:01.170614 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b1c881aa-d427-4a34-be1f-fab641887b67-catalog-content\") pod \"redhat-operators-4czb7\" (UID: \"b1c881aa-d427-4a34-be1f-fab641887b67\") " pod="openshift-marketplace/redhat-operators-4czb7" Dec 03 20:20:01.170750 master-0 kubenswrapper[29252]: I1203 20:20:01.170677 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c881aa-d427-4a34-be1f-fab641887b67-utilities\") pod \"redhat-operators-4czb7\" (UID: \"b1c881aa-d427-4a34-be1f-fab641887b67\") " pod="openshift-marketplace/redhat-operators-4czb7" Dec 03 20:20:01.170750 master-0 kubenswrapper[29252]: I1203 20:20:01.170721 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m9rc\" (UniqueName: \"kubernetes.io/projected/b1c881aa-d427-4a34-be1f-fab641887b67-kube-api-access-7m9rc\") pod \"redhat-operators-4czb7\" (UID: \"b1c881aa-d427-4a34-be1f-fab641887b67\") " pod="openshift-marketplace/redhat-operators-4czb7" Dec 03 20:20:01.171230 master-0 kubenswrapper[29252]: I1203 20:20:01.171184 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c881aa-d427-4a34-be1f-fab641887b67-catalog-content\") pod \"redhat-operators-4czb7\" (UID: \"b1c881aa-d427-4a34-be1f-fab641887b67\") " pod="openshift-marketplace/redhat-operators-4czb7" Dec 03 20:20:01.171341 master-0 kubenswrapper[29252]: I1203 20:20:01.171301 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c881aa-d427-4a34-be1f-fab641887b67-utilities\") pod \"redhat-operators-4czb7\" (UID: \"b1c881aa-d427-4a34-be1f-fab641887b67\") " pod="openshift-marketplace/redhat-operators-4czb7" Dec 03 20:20:01.187211 master-0 kubenswrapper[29252]: I1203 20:20:01.187151 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m9rc\" (UniqueName: 
\"kubernetes.io/projected/b1c881aa-d427-4a34-be1f-fab641887b67-kube-api-access-7m9rc\") pod \"redhat-operators-4czb7\" (UID: \"b1c881aa-d427-4a34-be1f-fab641887b67\") " pod="openshift-marketplace/redhat-operators-4czb7" Dec 03 20:20:01.248049 master-0 kubenswrapper[29252]: I1203 20:20:01.247966 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4czb7" Dec 03 20:20:01.704900 master-0 kubenswrapper[29252]: I1203 20:20:01.701357 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4czb7"] Dec 03 20:20:01.707386 master-0 kubenswrapper[29252]: W1203 20:20:01.707332 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1c881aa_d427_4a34_be1f_fab641887b67.slice/crio-e85a74be12749954f7af314ff2faf5432d04bc215e6d97a5dc96939158be55d7 WatchSource:0}: Error finding container e85a74be12749954f7af314ff2faf5432d04bc215e6d97a5dc96939158be55d7: Status 404 returned error can't find the container with id e85a74be12749954f7af314ff2faf5432d04bc215e6d97a5dc96939158be55d7 Dec 03 20:20:01.928275 master-0 kubenswrapper[29252]: I1203 20:20:01.928216 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4czb7" event={"ID":"b1c881aa-d427-4a34-be1f-fab641887b67","Type":"ContainerStarted","Data":"94c021081289b0ff12777306790997dfda4421e7580f74eaf0f39c8312b59b18"} Dec 03 20:20:01.928275 master-0 kubenswrapper[29252]: I1203 20:20:01.928277 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4czb7" event={"ID":"b1c881aa-d427-4a34-be1f-fab641887b67","Type":"ContainerStarted","Data":"e85a74be12749954f7af314ff2faf5432d04bc215e6d97a5dc96939158be55d7"} Dec 03 20:20:02.939866 master-0 kubenswrapper[29252]: I1203 20:20:02.939752 29252 generic.go:334] "Generic (PLEG): container finished" 
podID="b1c881aa-d427-4a34-be1f-fab641887b67" containerID="94c021081289b0ff12777306790997dfda4421e7580f74eaf0f39c8312b59b18" exitCode=0 Dec 03 20:20:02.939866 master-0 kubenswrapper[29252]: I1203 20:20:02.939863 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4czb7" event={"ID":"b1c881aa-d427-4a34-be1f-fab641887b67","Type":"ContainerDied","Data":"94c021081289b0ff12777306790997dfda4421e7580f74eaf0f39c8312b59b18"} Dec 03 20:20:03.358701 master-0 kubenswrapper[29252]: I1203 20:20:03.358611 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54"] Dec 03 20:20:03.361300 master-0 kubenswrapper[29252]: I1203 20:20:03.361245 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54" Dec 03 20:20:03.374270 master-0 kubenswrapper[29252]: I1203 20:20:03.374221 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54"] Dec 03 20:20:03.509126 master-0 kubenswrapper[29252]: I1203 20:20:03.509019 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4eb2808-07ac-4b8e-8094-c488807c9a25-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54\" (UID: \"a4eb2808-07ac-4b8e-8094-c488807c9a25\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54" Dec 03 20:20:03.509271 master-0 kubenswrapper[29252]: I1203 20:20:03.509218 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hj7w\" (UniqueName: \"kubernetes.io/projected/a4eb2808-07ac-4b8e-8094-c488807c9a25-kube-api-access-6hj7w\") pod 
\"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54\" (UID: \"a4eb2808-07ac-4b8e-8094-c488807c9a25\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54" Dec 03 20:20:03.509548 master-0 kubenswrapper[29252]: I1203 20:20:03.509486 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4eb2808-07ac-4b8e-8094-c488807c9a25-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54\" (UID: \"a4eb2808-07ac-4b8e-8094-c488807c9a25\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54" Dec 03 20:20:03.611225 master-0 kubenswrapper[29252]: I1203 20:20:03.611097 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hj7w\" (UniqueName: \"kubernetes.io/projected/a4eb2808-07ac-4b8e-8094-c488807c9a25-kube-api-access-6hj7w\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54\" (UID: \"a4eb2808-07ac-4b8e-8094-c488807c9a25\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54" Dec 03 20:20:03.611480 master-0 kubenswrapper[29252]: I1203 20:20:03.611231 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4eb2808-07ac-4b8e-8094-c488807c9a25-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54\" (UID: \"a4eb2808-07ac-4b8e-8094-c488807c9a25\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54" Dec 03 20:20:03.611480 master-0 kubenswrapper[29252]: I1203 20:20:03.611308 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4eb2808-07ac-4b8e-8094-c488807c9a25-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54\" (UID: 
\"a4eb2808-07ac-4b8e-8094-c488807c9a25\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54" Dec 03 20:20:03.611890 master-0 kubenswrapper[29252]: I1203 20:20:03.611869 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4eb2808-07ac-4b8e-8094-c488807c9a25-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54\" (UID: \"a4eb2808-07ac-4b8e-8094-c488807c9a25\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54" Dec 03 20:20:03.613300 master-0 kubenswrapper[29252]: I1203 20:20:03.613174 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4eb2808-07ac-4b8e-8094-c488807c9a25-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54\" (UID: \"a4eb2808-07ac-4b8e-8094-c488807c9a25\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54" Dec 03 20:20:03.634451 master-0 kubenswrapper[29252]: I1203 20:20:03.634391 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hj7w\" (UniqueName: \"kubernetes.io/projected/a4eb2808-07ac-4b8e-8094-c488807c9a25-kube-api-access-6hj7w\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54\" (UID: \"a4eb2808-07ac-4b8e-8094-c488807c9a25\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54" Dec 03 20:20:03.679798 master-0 kubenswrapper[29252]: I1203 20:20:03.679707 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54" Dec 03 20:20:03.951609 master-0 kubenswrapper[29252]: I1203 20:20:03.951465 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4czb7" event={"ID":"b1c881aa-d427-4a34-be1f-fab641887b67","Type":"ContainerStarted","Data":"1656c3c29c0a8bd48d984115aecd419147780ed58a7870f6e331f7cb4c786130"} Dec 03 20:20:04.179603 master-0 kubenswrapper[29252]: I1203 20:20:04.179536 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54"] Dec 03 20:20:04.181085 master-0 kubenswrapper[29252]: W1203 20:20:04.181001 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4eb2808_07ac_4b8e_8094_c488807c9a25.slice/crio-7fc4176afab157a7bbfb9a8a7cf0b2e60e049805a083cf5948e1c8efd926cbdb WatchSource:0}: Error finding container 7fc4176afab157a7bbfb9a8a7cf0b2e60e049805a083cf5948e1c8efd926cbdb: Status 404 returned error can't find the container with id 7fc4176afab157a7bbfb9a8a7cf0b2e60e049805a083cf5948e1c8efd926cbdb Dec 03 20:20:04.962461 master-0 kubenswrapper[29252]: I1203 20:20:04.962351 29252 generic.go:334] "Generic (PLEG): container finished" podID="b1c881aa-d427-4a34-be1f-fab641887b67" containerID="1656c3c29c0a8bd48d984115aecd419147780ed58a7870f6e331f7cb4c786130" exitCode=0 Dec 03 20:20:04.963206 master-0 kubenswrapper[29252]: I1203 20:20:04.962485 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4czb7" event={"ID":"b1c881aa-d427-4a34-be1f-fab641887b67","Type":"ContainerDied","Data":"1656c3c29c0a8bd48d984115aecd419147780ed58a7870f6e331f7cb4c786130"} Dec 03 20:20:04.964836 master-0 kubenswrapper[29252]: I1203 20:20:04.964381 29252 generic.go:334] "Generic (PLEG): container finished" 
podID="a4eb2808-07ac-4b8e-8094-c488807c9a25" containerID="9a7d35ad8a260e954d6b2bcbfac601e5e5eebccc2a4c8127d315925f168aeca7" exitCode=0
Dec 03 20:20:04.964836 master-0 kubenswrapper[29252]: I1203 20:20:04.964438 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54" event={"ID":"a4eb2808-07ac-4b8e-8094-c488807c9a25","Type":"ContainerDied","Data":"9a7d35ad8a260e954d6b2bcbfac601e5e5eebccc2a4c8127d315925f168aeca7"}
Dec 03 20:20:04.964836 master-0 kubenswrapper[29252]: I1203 20:20:04.964478 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54" event={"ID":"a4eb2808-07ac-4b8e-8094-c488807c9a25","Type":"ContainerStarted","Data":"7fc4176afab157a7bbfb9a8a7cf0b2e60e049805a083cf5948e1c8efd926cbdb"}
Dec 03 20:20:05.983507 master-0 kubenswrapper[29252]: I1203 20:20:05.983387 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4czb7" event={"ID":"b1c881aa-d427-4a34-be1f-fab641887b67","Type":"ContainerStarted","Data":"955a0357ea85d5579651e2dbaa8e5fa9e5ce554a3fb567ef37fb82145073303d"}
Dec 03 20:20:06.013479 master-0 kubenswrapper[29252]: I1203 20:20:06.013322 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4czb7" podStartSLOduration=3.547291944 podStartE2EDuration="6.01328758s" podCreationTimestamp="2025-12-03 20:20:00 +0000 UTC" firstStartedPulling="2025-12-03 20:20:02.941899296 +0000 UTC m=+637.755444269" lastFinishedPulling="2025-12-03 20:20:05.407894912 +0000 UTC m=+640.221439905" observedRunningTime="2025-12-03 20:20:06.009392374 +0000 UTC m=+640.822937337" watchObservedRunningTime="2025-12-03 20:20:06.01328758 +0000 UTC m=+640.826832543"
Dec 03 20:20:06.992683 master-0 kubenswrapper[29252]: I1203 20:20:06.992628 29252 generic.go:334] "Generic (PLEG): container finished" podID="a4eb2808-07ac-4b8e-8094-c488807c9a25" containerID="740de3a0fd7073f54d23f407a3d977af5197b563977ff2715b6a71c6b016727f" exitCode=0
Dec 03 20:20:06.993319 master-0 kubenswrapper[29252]: I1203 20:20:06.992681 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54" event={"ID":"a4eb2808-07ac-4b8e-8094-c488807c9a25","Type":"ContainerDied","Data":"740de3a0fd7073f54d23f407a3d977af5197b563977ff2715b6a71c6b016727f"}
Dec 03 20:20:08.003147 master-0 kubenswrapper[29252]: I1203 20:20:08.003084 29252 generic.go:334] "Generic (PLEG): container finished" podID="a4eb2808-07ac-4b8e-8094-c488807c9a25" containerID="d4764504cbfd08b10fb9ed2f79709de1fdf673886f0e6c1f0be83b0b88eb3655" exitCode=0
Dec 03 20:20:08.003748 master-0 kubenswrapper[29252]: I1203 20:20:08.003150 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54" event={"ID":"a4eb2808-07ac-4b8e-8094-c488807c9a25","Type":"ContainerDied","Data":"d4764504cbfd08b10fb9ed2f79709de1fdf673886f0e6c1f0be83b0b88eb3655"}
Dec 03 20:20:09.427450 master-0 kubenswrapper[29252]: I1203 20:20:09.427398 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54"
Dec 03 20:20:09.609717 master-0 kubenswrapper[29252]: I1203 20:20:09.609613 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4eb2808-07ac-4b8e-8094-c488807c9a25-bundle\") pod \"a4eb2808-07ac-4b8e-8094-c488807c9a25\" (UID: \"a4eb2808-07ac-4b8e-8094-c488807c9a25\") "
Dec 03 20:20:09.610023 master-0 kubenswrapper[29252]: I1203 20:20:09.609809 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4eb2808-07ac-4b8e-8094-c488807c9a25-util\") pod \"a4eb2808-07ac-4b8e-8094-c488807c9a25\" (UID: \"a4eb2808-07ac-4b8e-8094-c488807c9a25\") "
Dec 03 20:20:09.610023 master-0 kubenswrapper[29252]: I1203 20:20:09.609870 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hj7w\" (UniqueName: \"kubernetes.io/projected/a4eb2808-07ac-4b8e-8094-c488807c9a25-kube-api-access-6hj7w\") pod \"a4eb2808-07ac-4b8e-8094-c488807c9a25\" (UID: \"a4eb2808-07ac-4b8e-8094-c488807c9a25\") "
Dec 03 20:20:09.611952 master-0 kubenswrapper[29252]: I1203 20:20:09.611891 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4eb2808-07ac-4b8e-8094-c488807c9a25-bundle" (OuterVolumeSpecName: "bundle") pod "a4eb2808-07ac-4b8e-8094-c488807c9a25" (UID: "a4eb2808-07ac-4b8e-8094-c488807c9a25"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 20:20:09.613361 master-0 kubenswrapper[29252]: I1203 20:20:09.613307 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4eb2808-07ac-4b8e-8094-c488807c9a25-kube-api-access-6hj7w" (OuterVolumeSpecName: "kube-api-access-6hj7w") pod "a4eb2808-07ac-4b8e-8094-c488807c9a25" (UID: "a4eb2808-07ac-4b8e-8094-c488807c9a25"). InnerVolumeSpecName "kube-api-access-6hj7w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 20:20:09.643259 master-0 kubenswrapper[29252]: I1203 20:20:09.643188 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4eb2808-07ac-4b8e-8094-c488807c9a25-util" (OuterVolumeSpecName: "util") pod "a4eb2808-07ac-4b8e-8094-c488807c9a25" (UID: "a4eb2808-07ac-4b8e-8094-c488807c9a25"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 20:20:09.711437 master-0 kubenswrapper[29252]: I1203 20:20:09.711367 29252 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4eb2808-07ac-4b8e-8094-c488807c9a25-util\") on node \"master-0\" DevicePath \"\""
Dec 03 20:20:09.711437 master-0 kubenswrapper[29252]: I1203 20:20:09.711410 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hj7w\" (UniqueName: \"kubernetes.io/projected/a4eb2808-07ac-4b8e-8094-c488807c9a25-kube-api-access-6hj7w\") on node \"master-0\" DevicePath \"\""
Dec 03 20:20:09.711437 master-0 kubenswrapper[29252]: I1203 20:20:09.711420 29252 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4eb2808-07ac-4b8e-8094-c488807c9a25-bundle\") on node \"master-0\" DevicePath \"\""
Dec 03 20:20:10.025352 master-0 kubenswrapper[29252]: I1203 20:20:10.025164 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54" event={"ID":"a4eb2808-07ac-4b8e-8094-c488807c9a25","Type":"ContainerDied","Data":"7fc4176afab157a7bbfb9a8a7cf0b2e60e049805a083cf5948e1c8efd926cbdb"}
Dec 03 20:20:10.025352 master-0 kubenswrapper[29252]: I1203 20:20:10.025249 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fc4176afab157a7bbfb9a8a7cf0b2e60e049805a083cf5948e1c8efd926cbdb"
Dec 03 20:20:10.025352 master-0 kubenswrapper[29252]: I1203 20:20:10.025270 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4xmv54"
Dec 03 20:20:11.248581 master-0 kubenswrapper[29252]: I1203 20:20:11.248512 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4czb7"
Dec 03 20:20:11.248581 master-0 kubenswrapper[29252]: I1203 20:20:11.248580 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4czb7"
Dec 03 20:20:12.292437 master-0 kubenswrapper[29252]: I1203 20:20:12.292336 29252 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4czb7" podUID="b1c881aa-d427-4a34-be1f-fab641887b67" containerName="registry-server" probeResult="failure" output=<
Dec 03 20:20:12.292437 master-0 kubenswrapper[29252]: 	timeout: failed to connect service ":50051" within 1s
Dec 03 20:20:12.292437 master-0 kubenswrapper[29252]: >
Dec 03 20:20:16.764366 master-0 kubenswrapper[29252]: I1203 20:20:16.764313 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/lvms-operator-7d96b77997-8s46z"]
Dec 03 20:20:16.765405 master-0 kubenswrapper[29252]: E1203 20:20:16.765387 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4eb2808-07ac-4b8e-8094-c488807c9a25" containerName="util"
Dec 03 20:20:16.765483 master-0 kubenswrapper[29252]: I1203 20:20:16.765473 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4eb2808-07ac-4b8e-8094-c488807c9a25" containerName="util"
Dec 03 20:20:16.765559 master-0 kubenswrapper[29252]: E1203 20:20:16.765549 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4eb2808-07ac-4b8e-8094-c488807c9a25" containerName="pull"
Dec 03 20:20:16.765618 master-0 kubenswrapper[29252]: I1203 20:20:16.765609 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4eb2808-07ac-4b8e-8094-c488807c9a25" containerName="pull"
Dec 03 20:20:16.765689 master-0 kubenswrapper[29252]: E1203 20:20:16.765680 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4eb2808-07ac-4b8e-8094-c488807c9a25" containerName="extract"
Dec 03 20:20:16.765745 master-0 kubenswrapper[29252]: I1203 20:20:16.765736 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4eb2808-07ac-4b8e-8094-c488807c9a25" containerName="extract"
Dec 03 20:20:16.765954 master-0 kubenswrapper[29252]: I1203 20:20:16.765942 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4eb2808-07ac-4b8e-8094-c488807c9a25" containerName="extract"
Dec 03 20:20:16.766505 master-0 kubenswrapper[29252]: I1203 20:20:16.766489 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:16.769091 master-0 kubenswrapper[29252]: I1203 20:20:16.768818 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-service-cert"
Dec 03 20:20:16.769091 master-0 kubenswrapper[29252]: I1203 20:20:16.768930 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt"
Dec 03 20:20:16.769262 master-0 kubenswrapper[29252]: I1203 20:20:16.769105 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-metrics-cert"
Dec 03 20:20:16.769565 master-0 kubenswrapper[29252]: I1203 20:20:16.769524 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-webhook-server-cert"
Dec 03 20:20:16.769829 master-0 kubenswrapper[29252]: I1203 20:20:16.769809 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt"
Dec 03 20:20:16.788563 master-0 kubenswrapper[29252]: I1203 20:20:16.788512 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-7d96b77997-8s46z"]
Dec 03 20:20:16.947119 master-0 kubenswrapper[29252]: I1203 20:20:16.947046 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66zqx\" (UniqueName: \"kubernetes.io/projected/0a5a9bd4-822a-4f9b-b3f0-1b689bc35857-kube-api-access-66zqx\") pod \"lvms-operator-7d96b77997-8s46z\" (UID: \"0a5a9bd4-822a-4f9b-b3f0-1b689bc35857\") " pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:16.947368 master-0 kubenswrapper[29252]: I1203 20:20:16.947146 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0a5a9bd4-822a-4f9b-b3f0-1b689bc35857-apiservice-cert\") pod \"lvms-operator-7d96b77997-8s46z\" (UID: \"0a5a9bd4-822a-4f9b-b3f0-1b689bc35857\") " pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:16.947368 master-0 kubenswrapper[29252]: I1203 20:20:16.947184 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/0a5a9bd4-822a-4f9b-b3f0-1b689bc35857-socket-dir\") pod \"lvms-operator-7d96b77997-8s46z\" (UID: \"0a5a9bd4-822a-4f9b-b3f0-1b689bc35857\") " pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:16.947368 master-0 kubenswrapper[29252]: I1203 20:20:16.947234 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0a5a9bd4-822a-4f9b-b3f0-1b689bc35857-webhook-cert\") pod \"lvms-operator-7d96b77997-8s46z\" (UID: \"0a5a9bd4-822a-4f9b-b3f0-1b689bc35857\") " pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:16.947368 master-0 kubenswrapper[29252]: I1203 20:20:16.947259 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a5a9bd4-822a-4f9b-b3f0-1b689bc35857-metrics-cert\") pod \"lvms-operator-7d96b77997-8s46z\" (UID: \"0a5a9bd4-822a-4f9b-b3f0-1b689bc35857\") " pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:17.049553 master-0 kubenswrapper[29252]: I1203 20:20:17.049423 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66zqx\" (UniqueName: \"kubernetes.io/projected/0a5a9bd4-822a-4f9b-b3f0-1b689bc35857-kube-api-access-66zqx\") pod \"lvms-operator-7d96b77997-8s46z\" (UID: \"0a5a9bd4-822a-4f9b-b3f0-1b689bc35857\") " pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:17.049731 master-0 kubenswrapper[29252]: I1203 20:20:17.049607 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0a5a9bd4-822a-4f9b-b3f0-1b689bc35857-apiservice-cert\") pod \"lvms-operator-7d96b77997-8s46z\" (UID: \"0a5a9bd4-822a-4f9b-b3f0-1b689bc35857\") " pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:17.049731 master-0 kubenswrapper[29252]: I1203 20:20:17.049685 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/0a5a9bd4-822a-4f9b-b3f0-1b689bc35857-socket-dir\") pod \"lvms-operator-7d96b77997-8s46z\" (UID: \"0a5a9bd4-822a-4f9b-b3f0-1b689bc35857\") " pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:17.049894 master-0 kubenswrapper[29252]: I1203 20:20:17.049857 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0a5a9bd4-822a-4f9b-b3f0-1b689bc35857-webhook-cert\") pod \"lvms-operator-7d96b77997-8s46z\" (UID: \"0a5a9bd4-822a-4f9b-b3f0-1b689bc35857\") " pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:17.049959 master-0 kubenswrapper[29252]: I1203 20:20:17.049910 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a5a9bd4-822a-4f9b-b3f0-1b689bc35857-metrics-cert\") pod \"lvms-operator-7d96b77997-8s46z\" (UID: \"0a5a9bd4-822a-4f9b-b3f0-1b689bc35857\") " pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:17.050361 master-0 kubenswrapper[29252]: I1203 20:20:17.050314 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/0a5a9bd4-822a-4f9b-b3f0-1b689bc35857-socket-dir\") pod \"lvms-operator-7d96b77997-8s46z\" (UID: \"0a5a9bd4-822a-4f9b-b3f0-1b689bc35857\") " pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:17.053466 master-0 kubenswrapper[29252]: I1203 20:20:17.053422 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0a5a9bd4-822a-4f9b-b3f0-1b689bc35857-webhook-cert\") pod \"lvms-operator-7d96b77997-8s46z\" (UID: \"0a5a9bd4-822a-4f9b-b3f0-1b689bc35857\") " pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:17.053765 master-0 kubenswrapper[29252]: I1203 20:20:17.053717 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0a5a9bd4-822a-4f9b-b3f0-1b689bc35857-apiservice-cert\") pod \"lvms-operator-7d96b77997-8s46z\" (UID: \"0a5a9bd4-822a-4f9b-b3f0-1b689bc35857\") " pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:17.054123 master-0 kubenswrapper[29252]: I1203 20:20:17.054070 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a5a9bd4-822a-4f9b-b3f0-1b689bc35857-metrics-cert\") pod \"lvms-operator-7d96b77997-8s46z\" (UID: \"0a5a9bd4-822a-4f9b-b3f0-1b689bc35857\") " pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:17.082141 master-0 kubenswrapper[29252]: I1203 20:20:17.082093 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66zqx\" (UniqueName: \"kubernetes.io/projected/0a5a9bd4-822a-4f9b-b3f0-1b689bc35857-kube-api-access-66zqx\") pod \"lvms-operator-7d96b77997-8s46z\" (UID: \"0a5a9bd4-822a-4f9b-b3f0-1b689bc35857\") " pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:17.381239 master-0 kubenswrapper[29252]: I1203 20:20:17.381167 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:17.792213 master-0 kubenswrapper[29252]: I1203 20:20:17.792091 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-7d96b77997-8s46z"]
Dec 03 20:20:17.796740 master-0 kubenswrapper[29252]: W1203 20:20:17.796548 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a5a9bd4_822a_4f9b_b3f0_1b689bc35857.slice/crio-2f449240b6ab40710c6cd07b7e2359ee2b539523aad42666cab9b69f5e87caef WatchSource:0}: Error finding container 2f449240b6ab40710c6cd07b7e2359ee2b539523aad42666cab9b69f5e87caef: Status 404 returned error can't find the container with id 2f449240b6ab40710c6cd07b7e2359ee2b539523aad42666cab9b69f5e87caef
Dec 03 20:20:18.129263 master-0 kubenswrapper[29252]: I1203 20:20:18.129194 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-7d96b77997-8s46z" event={"ID":"0a5a9bd4-822a-4f9b-b3f0-1b689bc35857","Type":"ContainerStarted","Data":"2f449240b6ab40710c6cd07b7e2359ee2b539523aad42666cab9b69f5e87caef"}
Dec 03 20:20:21.290163 master-0 kubenswrapper[29252]: I1203 20:20:21.290060 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4czb7"
Dec 03 20:20:21.338901 master-0 kubenswrapper[29252]: I1203 20:20:21.338836 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4czb7"
Dec 03 20:20:23.184459 master-0 kubenswrapper[29252]: I1203 20:20:23.184409 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-7d96b77997-8s46z" event={"ID":"0a5a9bd4-822a-4f9b-b3f0-1b689bc35857","Type":"ContainerStarted","Data":"b38d20c86de79a77d1d586d1e0f36010d7e4bc76f39913b13364bf2c79b8a7c9"}
Dec 03 20:20:23.185058 master-0 kubenswrapper[29252]: I1203 20:20:23.184896 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:23.189222 master-0 kubenswrapper[29252]: I1203 20:20:23.189189 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/lvms-operator-7d96b77997-8s46z"
Dec 03 20:20:23.202660 master-0 kubenswrapper[29252]: I1203 20:20:23.202556 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/lvms-operator-7d96b77997-8s46z" podStartSLOduration=2.80115071 podStartE2EDuration="7.202541621s" podCreationTimestamp="2025-12-03 20:20:16 +0000 UTC" firstStartedPulling="2025-12-03 20:20:17.798623028 +0000 UTC m=+652.612167981" lastFinishedPulling="2025-12-03 20:20:22.200013889 +0000 UTC m=+657.013558892" observedRunningTime="2025-12-03 20:20:23.201154297 +0000 UTC m=+658.014699270" watchObservedRunningTime="2025-12-03 20:20:23.202541621 +0000 UTC m=+658.016086584"
Dec 03 20:20:23.458157 master-0 kubenswrapper[29252]: I1203 20:20:23.457985 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4czb7"]
Dec 03 20:20:23.458539 master-0 kubenswrapper[29252]: I1203 20:20:23.458496 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4czb7" podUID="b1c881aa-d427-4a34-be1f-fab641887b67" containerName="registry-server" containerID="cri-o://955a0357ea85d5579651e2dbaa8e5fa9e5ce554a3fb567ef37fb82145073303d" gracePeriod=2
Dec 03 20:20:24.014223 master-0 kubenswrapper[29252]: I1203 20:20:24.014164 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4czb7"
Dec 03 20:20:24.033620 master-0 kubenswrapper[29252]: I1203 20:20:24.033560 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c881aa-d427-4a34-be1f-fab641887b67-catalog-content\") pod \"b1c881aa-d427-4a34-be1f-fab641887b67\" (UID: \"b1c881aa-d427-4a34-be1f-fab641887b67\") "
Dec 03 20:20:24.033620 master-0 kubenswrapper[29252]: I1203 20:20:24.033610 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c881aa-d427-4a34-be1f-fab641887b67-utilities\") pod \"b1c881aa-d427-4a34-be1f-fab641887b67\" (UID: \"b1c881aa-d427-4a34-be1f-fab641887b67\") "
Dec 03 20:20:24.033875 master-0 kubenswrapper[29252]: I1203 20:20:24.033707 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m9rc\" (UniqueName: \"kubernetes.io/projected/b1c881aa-d427-4a34-be1f-fab641887b67-kube-api-access-7m9rc\") pod \"b1c881aa-d427-4a34-be1f-fab641887b67\" (UID: \"b1c881aa-d427-4a34-be1f-fab641887b67\") "
Dec 03 20:20:24.035482 master-0 kubenswrapper[29252]: I1203 20:20:24.035412 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1c881aa-d427-4a34-be1f-fab641887b67-utilities" (OuterVolumeSpecName: "utilities") pod "b1c881aa-d427-4a34-be1f-fab641887b67" (UID: "b1c881aa-d427-4a34-be1f-fab641887b67"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 20:20:24.038981 master-0 kubenswrapper[29252]: I1203 20:20:24.038939 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1c881aa-d427-4a34-be1f-fab641887b67-kube-api-access-7m9rc" (OuterVolumeSpecName: "kube-api-access-7m9rc") pod "b1c881aa-d427-4a34-be1f-fab641887b67" (UID: "b1c881aa-d427-4a34-be1f-fab641887b67"). InnerVolumeSpecName "kube-api-access-7m9rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 20:20:24.142811 master-0 kubenswrapper[29252]: I1203 20:20:24.138017 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m9rc\" (UniqueName: \"kubernetes.io/projected/b1c881aa-d427-4a34-be1f-fab641887b67-kube-api-access-7m9rc\") on node \"master-0\" DevicePath \"\""
Dec 03 20:20:24.142811 master-0 kubenswrapper[29252]: I1203 20:20:24.138066 29252 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c881aa-d427-4a34-be1f-fab641887b67-utilities\") on node \"master-0\" DevicePath \"\""
Dec 03 20:20:24.147546 master-0 kubenswrapper[29252]: I1203 20:20:24.144903 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1c881aa-d427-4a34-be1f-fab641887b67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1c881aa-d427-4a34-be1f-fab641887b67" (UID: "b1c881aa-d427-4a34-be1f-fab641887b67"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 20:20:24.194823 master-0 kubenswrapper[29252]: I1203 20:20:24.194604 29252 generic.go:334] "Generic (PLEG): container finished" podID="b1c881aa-d427-4a34-be1f-fab641887b67" containerID="955a0357ea85d5579651e2dbaa8e5fa9e5ce554a3fb567ef37fb82145073303d" exitCode=0
Dec 03 20:20:24.195405 master-0 kubenswrapper[29252]: I1203 20:20:24.195088 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4czb7"
Dec 03 20:20:24.195713 master-0 kubenswrapper[29252]: I1203 20:20:24.195670 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4czb7" event={"ID":"b1c881aa-d427-4a34-be1f-fab641887b67","Type":"ContainerDied","Data":"955a0357ea85d5579651e2dbaa8e5fa9e5ce554a3fb567ef37fb82145073303d"}
Dec 03 20:20:24.195824 master-0 kubenswrapper[29252]: I1203 20:20:24.195722 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4czb7" event={"ID":"b1c881aa-d427-4a34-be1f-fab641887b67","Type":"ContainerDied","Data":"e85a74be12749954f7af314ff2faf5432d04bc215e6d97a5dc96939158be55d7"}
Dec 03 20:20:24.195824 master-0 kubenswrapper[29252]: I1203 20:20:24.195751 29252 scope.go:117] "RemoveContainer" containerID="955a0357ea85d5579651e2dbaa8e5fa9e5ce554a3fb567ef37fb82145073303d"
Dec 03 20:20:24.219043 master-0 kubenswrapper[29252]: I1203 20:20:24.219004 29252 scope.go:117] "RemoveContainer" containerID="1656c3c29c0a8bd48d984115aecd419147780ed58a7870f6e331f7cb4c786130"
Dec 03 20:20:24.235758 master-0 kubenswrapper[29252]: I1203 20:20:24.235730 29252 scope.go:117] "RemoveContainer" containerID="94c021081289b0ff12777306790997dfda4421e7580f74eaf0f39c8312b59b18"
Dec 03 20:20:24.239417 master-0 kubenswrapper[29252]: I1203 20:20:24.239374 29252 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c881aa-d427-4a34-be1f-fab641887b67-catalog-content\") on node \"master-0\" DevicePath \"\""
Dec 03 20:20:24.284398 master-0 kubenswrapper[29252]: I1203 20:20:24.284349 29252 scope.go:117] "RemoveContainer" containerID="955a0357ea85d5579651e2dbaa8e5fa9e5ce554a3fb567ef37fb82145073303d"
Dec 03 20:20:24.284902 master-0 kubenswrapper[29252]: E1203 20:20:24.284869 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"955a0357ea85d5579651e2dbaa8e5fa9e5ce554a3fb567ef37fb82145073303d\": container with ID starting with 955a0357ea85d5579651e2dbaa8e5fa9e5ce554a3fb567ef37fb82145073303d not found: ID does not exist" containerID="955a0357ea85d5579651e2dbaa8e5fa9e5ce554a3fb567ef37fb82145073303d"
Dec 03 20:20:24.284991 master-0 kubenswrapper[29252]: I1203 20:20:24.284908 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"955a0357ea85d5579651e2dbaa8e5fa9e5ce554a3fb567ef37fb82145073303d"} err="failed to get container status \"955a0357ea85d5579651e2dbaa8e5fa9e5ce554a3fb567ef37fb82145073303d\": rpc error: code = NotFound desc = could not find container \"955a0357ea85d5579651e2dbaa8e5fa9e5ce554a3fb567ef37fb82145073303d\": container with ID starting with 955a0357ea85d5579651e2dbaa8e5fa9e5ce554a3fb567ef37fb82145073303d not found: ID does not exist"
Dec 03 20:20:24.284991 master-0 kubenswrapper[29252]: I1203 20:20:24.284929 29252 scope.go:117] "RemoveContainer" containerID="1656c3c29c0a8bd48d984115aecd419147780ed58a7870f6e331f7cb4c786130"
Dec 03 20:20:24.285336 master-0 kubenswrapper[29252]: E1203 20:20:24.285300 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1656c3c29c0a8bd48d984115aecd419147780ed58a7870f6e331f7cb4c786130\": container with ID starting with 1656c3c29c0a8bd48d984115aecd419147780ed58a7870f6e331f7cb4c786130 not found: ID does not exist" containerID="1656c3c29c0a8bd48d984115aecd419147780ed58a7870f6e331f7cb4c786130"
Dec 03 20:20:24.285336 master-0 kubenswrapper[29252]: I1203 20:20:24.285330 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1656c3c29c0a8bd48d984115aecd419147780ed58a7870f6e331f7cb4c786130"} err="failed to get container status \"1656c3c29c0a8bd48d984115aecd419147780ed58a7870f6e331f7cb4c786130\": rpc error: code = NotFound desc = could not find container \"1656c3c29c0a8bd48d984115aecd419147780ed58a7870f6e331f7cb4c786130\": container with ID starting with 1656c3c29c0a8bd48d984115aecd419147780ed58a7870f6e331f7cb4c786130 not found: ID does not exist"
Dec 03 20:20:24.285450 master-0 kubenswrapper[29252]: I1203 20:20:24.285343 29252 scope.go:117] "RemoveContainer" containerID="94c021081289b0ff12777306790997dfda4421e7580f74eaf0f39c8312b59b18"
Dec 03 20:20:24.286417 master-0 kubenswrapper[29252]: E1203 20:20:24.286390 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94c021081289b0ff12777306790997dfda4421e7580f74eaf0f39c8312b59b18\": container with ID starting with 94c021081289b0ff12777306790997dfda4421e7580f74eaf0f39c8312b59b18 not found: ID does not exist" containerID="94c021081289b0ff12777306790997dfda4421e7580f74eaf0f39c8312b59b18"
Dec 03 20:20:24.286500 master-0 kubenswrapper[29252]: I1203 20:20:24.286412 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94c021081289b0ff12777306790997dfda4421e7580f74eaf0f39c8312b59b18"} err="failed to get container status \"94c021081289b0ff12777306790997dfda4421e7580f74eaf0f39c8312b59b18\": rpc error: code = NotFound desc = could not find container \"94c021081289b0ff12777306790997dfda4421e7580f74eaf0f39c8312b59b18\": container with ID starting with 94c021081289b0ff12777306790997dfda4421e7580f74eaf0f39c8312b59b18 not found: ID does not exist"
Dec 03 20:20:24.289366 master-0 kubenswrapper[29252]: I1203 20:20:24.289325 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4czb7"]
Dec 03 20:20:24.292652 master-0 kubenswrapper[29252]: I1203 20:20:24.292596 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4czb7"]
Dec 03 20:20:25.432000 master-0 kubenswrapper[29252]: I1203 20:20:25.431922 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1c881aa-d427-4a34-be1f-fab641887b67" path="/var/lib/kubelet/pods/b1c881aa-d427-4a34-be1f-fab641887b67/volumes"
Dec 03 20:20:27.349482 master-0 kubenswrapper[29252]: I1203 20:20:27.349414 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp"]
Dec 03 20:20:27.350192 master-0 kubenswrapper[29252]: E1203 20:20:27.349668 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c881aa-d427-4a34-be1f-fab641887b67" containerName="extract-content"
Dec 03 20:20:27.350192 master-0 kubenswrapper[29252]: I1203 20:20:27.349681 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c881aa-d427-4a34-be1f-fab641887b67" containerName="extract-content"
Dec 03 20:20:27.350192 master-0 kubenswrapper[29252]: E1203 20:20:27.349700 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c881aa-d427-4a34-be1f-fab641887b67" containerName="registry-server"
Dec 03 20:20:27.350192 master-0 kubenswrapper[29252]: I1203 20:20:27.349706 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c881aa-d427-4a34-be1f-fab641887b67" containerName="registry-server"
Dec 03 20:20:27.350192 master-0 kubenswrapper[29252]: E1203 20:20:27.349724 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c881aa-d427-4a34-be1f-fab641887b67" containerName="extract-utilities"
Dec 03 20:20:27.350192 master-0 kubenswrapper[29252]: I1203 20:20:27.349749 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c881aa-d427-4a34-be1f-fab641887b67" containerName="extract-utilities"
Dec 03 20:20:27.350192 master-0 kubenswrapper[29252]: I1203 20:20:27.349952 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1c881aa-d427-4a34-be1f-fab641887b67" containerName="registry-server"
Dec 03 20:20:27.351001 master-0 kubenswrapper[29252]: I1203 20:20:27.350965 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp"
Dec 03 20:20:27.369222 master-0 kubenswrapper[29252]: I1203 20:20:27.369160 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp"]
Dec 03 20:20:27.486510 master-0 kubenswrapper[29252]: I1203 20:20:27.486404 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48466adb-a4ff-4794-b6e3-ac171a498426-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp\" (UID: \"48466adb-a4ff-4794-b6e3-ac171a498426\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp"
Dec 03 20:20:27.486873 master-0 kubenswrapper[29252]: I1203 20:20:27.486854 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48466adb-a4ff-4794-b6e3-ac171a498426-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp\" (UID: \"48466adb-a4ff-4794-b6e3-ac171a498426\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp"
Dec 03 20:20:27.488847 master-0 kubenswrapper[29252]: I1203 20:20:27.488555 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t8rr\" (UniqueName: \"kubernetes.io/projected/48466adb-a4ff-4794-b6e3-ac171a498426-kube-api-access-8t8rr\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp\" (UID: \"48466adb-a4ff-4794-b6e3-ac171a498426\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp"
Dec 03 20:20:27.592221 master-0 kubenswrapper[29252]: I1203 20:20:27.592149 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t8rr\" (UniqueName: \"kubernetes.io/projected/48466adb-a4ff-4794-b6e3-ac171a498426-kube-api-access-8t8rr\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp\" (UID: \"48466adb-a4ff-4794-b6e3-ac171a498426\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp"
Dec 03 20:20:27.592517 master-0 kubenswrapper[29252]: I1203 20:20:27.592282 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48466adb-a4ff-4794-b6e3-ac171a498426-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp\" (UID: \"48466adb-a4ff-4794-b6e3-ac171a498426\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp"
Dec 03 20:20:27.592517 master-0 kubenswrapper[29252]: I1203 20:20:27.592350 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48466adb-a4ff-4794-b6e3-ac171a498426-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp\" (UID: \"48466adb-a4ff-4794-b6e3-ac171a498426\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp"
Dec 03 20:20:27.593007 master-0 kubenswrapper[29252]: I1203 20:20:27.592968 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48466adb-a4ff-4794-b6e3-ac171a498426-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp\" (UID: \"48466adb-a4ff-4794-b6e3-ac171a498426\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp"
Dec 03 20:20:27.593304 master-0 kubenswrapper[29252]: I1203 20:20:27.593261 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48466adb-a4ff-4794-b6e3-ac171a498426-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp\" (UID: \"48466adb-a4ff-4794-b6e3-ac171a498426\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp"
Dec 03 20:20:27.615174 master-0 kubenswrapper[29252]: I1203 20:20:27.615096 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t8rr\" (UniqueName: \"kubernetes.io/projected/48466adb-a4ff-4794-b6e3-ac171a498426-kube-api-access-8t8rr\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp\" (UID: \"48466adb-a4ff-4794-b6e3-ac171a498426\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp"
Dec 03 20:20:27.670392 master-0 kubenswrapper[29252]: I1203 20:20:27.670320 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp"
Dec 03 20:20:27.742165 master-0 kubenswrapper[29252]: I1203 20:20:27.742065 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh"]
Dec 03 20:20:27.750336 master-0 kubenswrapper[29252]: I1203 20:20:27.744275 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh"
Dec 03 20:20:27.764647 master-0 kubenswrapper[29252]: I1203 20:20:27.764222 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh"]
Dec 03 20:20:27.819091 master-0 kubenswrapper[29252]: I1203 20:20:27.819023 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/189561f3-f21c-4152-bc93-618725d14d4e-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh\" (UID: \"189561f3-f21c-4152-bc93-618725d14d4e\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh"
Dec 03 20:20:27.819091 master-0 kubenswrapper[29252]: I1203 20:20:27.819078 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z2nr\" (UniqueName: \"kubernetes.io/projected/189561f3-f21c-4152-bc93-618725d14d4e-kube-api-access-2z2nr\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh\" (UID: \"189561f3-f21c-4152-bc93-618725d14d4e\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh"
Dec 03 20:20:27.819412 master-0 kubenswrapper[29252]: I1203 20:20:27.819112 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/189561f3-f21c-4152-bc93-618725d14d4e-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh\" (UID: \"189561f3-f21c-4152-bc93-618725d14d4e\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh"
Dec 03 20:20:27.921585 master-0 kubenswrapper[29252]: I1203 20:20:27.920960 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/189561f3-f21c-4152-bc93-618725d14d4e-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh\" (UID: \"189561f3-f21c-4152-bc93-618725d14d4e\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh" Dec 03 20:20:27.921585 master-0 kubenswrapper[29252]: I1203 20:20:27.921029 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z2nr\" (UniqueName: \"kubernetes.io/projected/189561f3-f21c-4152-bc93-618725d14d4e-kube-api-access-2z2nr\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh\" (UID: \"189561f3-f21c-4152-bc93-618725d14d4e\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh" Dec 03 20:20:27.921585 master-0 kubenswrapper[29252]: I1203 20:20:27.921072 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/189561f3-f21c-4152-bc93-618725d14d4e-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh\" (UID: \"189561f3-f21c-4152-bc93-618725d14d4e\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh" Dec 03 20:20:27.921585 master-0 kubenswrapper[29252]: I1203 20:20:27.921545 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/189561f3-f21c-4152-bc93-618725d14d4e-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh\" (UID: \"189561f3-f21c-4152-bc93-618725d14d4e\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh" Dec 03 20:20:27.922633 master-0 kubenswrapper[29252]: I1203 20:20:27.922595 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/189561f3-f21c-4152-bc93-618725d14d4e-bundle\") pod 
\"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh\" (UID: \"189561f3-f21c-4152-bc93-618725d14d4e\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh" Dec 03 20:20:27.942869 master-0 kubenswrapper[29252]: I1203 20:20:27.942821 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z2nr\" (UniqueName: \"kubernetes.io/projected/189561f3-f21c-4152-bc93-618725d14d4e-kube-api-access-2z2nr\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh\" (UID: \"189561f3-f21c-4152-bc93-618725d14d4e\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh" Dec 03 20:20:28.094730 master-0 kubenswrapper[29252]: I1203 20:20:28.094665 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh" Dec 03 20:20:28.127372 master-0 kubenswrapper[29252]: I1203 20:20:28.127327 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp"] Dec 03 20:20:28.133685 master-0 kubenswrapper[29252]: W1203 20:20:28.133520 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48466adb_a4ff_4794_b6e3_ac171a498426.slice/crio-70889f7542978ced7c8a4a5a9ac9ae76deca15322fed056e1d9279ba6c2295e7 WatchSource:0}: Error finding container 70889f7542978ced7c8a4a5a9ac9ae76deca15322fed056e1d9279ba6c2295e7: Status 404 returned error can't find the container with id 70889f7542978ced7c8a4a5a9ac9ae76deca15322fed056e1d9279ba6c2295e7 Dec 03 20:20:28.241879 master-0 kubenswrapper[29252]: I1203 20:20:28.236414 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp" 
event={"ID":"48466adb-a4ff-4794-b6e3-ac171a498426","Type":"ContainerStarted","Data":"70889f7542978ced7c8a4a5a9ac9ae76deca15322fed056e1d9279ba6c2295e7"} Dec 03 20:20:28.526422 master-0 kubenswrapper[29252]: I1203 20:20:28.526312 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q"] Dec 03 20:20:28.527792 master-0 kubenswrapper[29252]: I1203 20:20:28.527741 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q" Dec 03 20:20:28.548210 master-0 kubenswrapper[29252]: I1203 20:20:28.548161 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh"] Dec 03 20:20:28.548344 master-0 kubenswrapper[29252]: W1203 20:20:28.548297 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod189561f3_f21c_4152_bc93_618725d14d4e.slice/crio-94c7000003e92b07bbf8893f1044f7badf79e67b6251d73a4cce193b18b7b31c WatchSource:0}: Error finding container 94c7000003e92b07bbf8893f1044f7badf79e67b6251d73a4cce193b18b7b31c: Status 404 returned error can't find the container with id 94c7000003e92b07bbf8893f1044f7badf79e67b6251d73a4cce193b18b7b31c Dec 03 20:20:28.554551 master-0 kubenswrapper[29252]: I1203 20:20:28.554502 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q"] Dec 03 20:20:28.631608 master-0 kubenswrapper[29252]: I1203 20:20:28.631551 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a61efaac-71f4-4842-9e63-bc48ee87fa41-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q\" (UID: 
\"a61efaac-71f4-4842-9e63-bc48ee87fa41\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q" Dec 03 20:20:28.631813 master-0 kubenswrapper[29252]: I1203 20:20:28.631650 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a61efaac-71f4-4842-9e63-bc48ee87fa41-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q\" (UID: \"a61efaac-71f4-4842-9e63-bc48ee87fa41\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q" Dec 03 20:20:28.631813 master-0 kubenswrapper[29252]: I1203 20:20:28.631695 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jk8k\" (UniqueName: \"kubernetes.io/projected/a61efaac-71f4-4842-9e63-bc48ee87fa41-kube-api-access-6jk8k\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q\" (UID: \"a61efaac-71f4-4842-9e63-bc48ee87fa41\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q" Dec 03 20:20:28.733464 master-0 kubenswrapper[29252]: I1203 20:20:28.733416 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a61efaac-71f4-4842-9e63-bc48ee87fa41-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q\" (UID: \"a61efaac-71f4-4842-9e63-bc48ee87fa41\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q" Dec 03 20:20:28.733642 master-0 kubenswrapper[29252]: I1203 20:20:28.733502 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jk8k\" (UniqueName: \"kubernetes.io/projected/a61efaac-71f4-4842-9e63-bc48ee87fa41-kube-api-access-6jk8k\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q\" (UID: 
\"a61efaac-71f4-4842-9e63-bc48ee87fa41\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q" Dec 03 20:20:28.733642 master-0 kubenswrapper[29252]: I1203 20:20:28.733550 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a61efaac-71f4-4842-9e63-bc48ee87fa41-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q\" (UID: \"a61efaac-71f4-4842-9e63-bc48ee87fa41\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q" Dec 03 20:20:28.734087 master-0 kubenswrapper[29252]: I1203 20:20:28.734044 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a61efaac-71f4-4842-9e63-bc48ee87fa41-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q\" (UID: \"a61efaac-71f4-4842-9e63-bc48ee87fa41\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q" Dec 03 20:20:28.734166 master-0 kubenswrapper[29252]: I1203 20:20:28.734077 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a61efaac-71f4-4842-9e63-bc48ee87fa41-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q\" (UID: \"a61efaac-71f4-4842-9e63-bc48ee87fa41\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q" Dec 03 20:20:28.753518 master-0 kubenswrapper[29252]: I1203 20:20:28.753478 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jk8k\" (UniqueName: \"kubernetes.io/projected/a61efaac-71f4-4842-9e63-bc48ee87fa41-kube-api-access-6jk8k\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q\" (UID: \"a61efaac-71f4-4842-9e63-bc48ee87fa41\") " 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q" Dec 03 20:20:28.843985 master-0 kubenswrapper[29252]: I1203 20:20:28.842844 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q" Dec 03 20:20:29.254973 master-0 kubenswrapper[29252]: I1203 20:20:29.254827 29252 generic.go:334] "Generic (PLEG): container finished" podID="48466adb-a4ff-4794-b6e3-ac171a498426" containerID="056679136aaefa727186ed31f5495559b5aa663ace6306cd4c801b627d702ddd" exitCode=0 Dec 03 20:20:29.254973 master-0 kubenswrapper[29252]: I1203 20:20:29.254907 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp" event={"ID":"48466adb-a4ff-4794-b6e3-ac171a498426","Type":"ContainerDied","Data":"056679136aaefa727186ed31f5495559b5aa663ace6306cd4c801b627d702ddd"} Dec 03 20:20:29.259344 master-0 kubenswrapper[29252]: I1203 20:20:29.257582 29252 generic.go:334] "Generic (PLEG): container finished" podID="189561f3-f21c-4152-bc93-618725d14d4e" containerID="51690521652bae8224bd821a9b1c02ea61ffef0fd344d084645c6163135753b5" exitCode=0 Dec 03 20:20:29.259344 master-0 kubenswrapper[29252]: I1203 20:20:29.257646 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh" event={"ID":"189561f3-f21c-4152-bc93-618725d14d4e","Type":"ContainerDied","Data":"51690521652bae8224bd821a9b1c02ea61ffef0fd344d084645c6163135753b5"} Dec 03 20:20:29.259344 master-0 kubenswrapper[29252]: I1203 20:20:29.257687 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh" event={"ID":"189561f3-f21c-4152-bc93-618725d14d4e","Type":"ContainerStarted","Data":"94c7000003e92b07bbf8893f1044f7badf79e67b6251d73a4cce193b18b7b31c"} 
Dec 03 20:20:29.279112 master-0 kubenswrapper[29252]: I1203 20:20:29.277395 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q"] Dec 03 20:20:30.278494 master-0 kubenswrapper[29252]: I1203 20:20:30.278404 29252 generic.go:334] "Generic (PLEG): container finished" podID="a61efaac-71f4-4842-9e63-bc48ee87fa41" containerID="141e89ef171603cf404a06205f877dab2b4f951e04c3a56611c09eeaebca1dbf" exitCode=0 Dec 03 20:20:30.278494 master-0 kubenswrapper[29252]: I1203 20:20:30.278456 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q" event={"ID":"a61efaac-71f4-4842-9e63-bc48ee87fa41","Type":"ContainerDied","Data":"141e89ef171603cf404a06205f877dab2b4f951e04c3a56611c09eeaebca1dbf"} Dec 03 20:20:30.278494 master-0 kubenswrapper[29252]: I1203 20:20:30.278484 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q" event={"ID":"a61efaac-71f4-4842-9e63-bc48ee87fa41","Type":"ContainerStarted","Data":"5c0129dee58d7608520e8dd0be65a9046b80c068544f3119e2115eef7ba0801a"} Dec 03 20:20:31.292197 master-0 kubenswrapper[29252]: I1203 20:20:31.292136 29252 generic.go:334] "Generic (PLEG): container finished" podID="189561f3-f21c-4152-bc93-618725d14d4e" containerID="5192b33bf3625a9b485c478a36e21667908aac65c6327eae6c3d782611bc68c6" exitCode=0 Dec 03 20:20:31.292702 master-0 kubenswrapper[29252]: I1203 20:20:31.292216 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh" event={"ID":"189561f3-f21c-4152-bc93-618725d14d4e","Type":"ContainerDied","Data":"5192b33bf3625a9b485c478a36e21667908aac65c6327eae6c3d782611bc68c6"} Dec 03 20:20:33.310841 master-0 kubenswrapper[29252]: I1203 20:20:33.310750 29252 generic.go:334] "Generic 
(PLEG): container finished" podID="189561f3-f21c-4152-bc93-618725d14d4e" containerID="ec8d7222110ff4b9593ecee55add3931d3af9f69bb8867d31ba9a484f159dd8f" exitCode=0 Dec 03 20:20:33.311757 master-0 kubenswrapper[29252]: I1203 20:20:33.310856 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh" event={"ID":"189561f3-f21c-4152-bc93-618725d14d4e","Type":"ContainerDied","Data":"ec8d7222110ff4b9593ecee55add3931d3af9f69bb8867d31ba9a484f159dd8f"} Dec 03 20:20:33.314843 master-0 kubenswrapper[29252]: I1203 20:20:33.314722 29252 generic.go:334] "Generic (PLEG): container finished" podID="a61efaac-71f4-4842-9e63-bc48ee87fa41" containerID="bdd4b1d5ee89ff6574bb6bfbe29566263df350f5bcf9059bec13c6b595c4d671" exitCode=0 Dec 03 20:20:33.314989 master-0 kubenswrapper[29252]: I1203 20:20:33.314796 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q" event={"ID":"a61efaac-71f4-4842-9e63-bc48ee87fa41","Type":"ContainerDied","Data":"bdd4b1d5ee89ff6574bb6bfbe29566263df350f5bcf9059bec13c6b595c4d671"} Dec 03 20:20:33.317753 master-0 kubenswrapper[29252]: I1203 20:20:33.317527 29252 generic.go:334] "Generic (PLEG): container finished" podID="48466adb-a4ff-4794-b6e3-ac171a498426" containerID="9e64d022ac46d83654372718e29a21e24f579030a9d671e7334918707f84e744" exitCode=0 Dec 03 20:20:33.317753 master-0 kubenswrapper[29252]: I1203 20:20:33.317572 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp" event={"ID":"48466adb-a4ff-4794-b6e3-ac171a498426","Type":"ContainerDied","Data":"9e64d022ac46d83654372718e29a21e24f579030a9d671e7334918707f84e744"} Dec 03 20:20:34.327806 master-0 kubenswrapper[29252]: I1203 20:20:34.327722 29252 generic.go:334] "Generic (PLEG): container finished" 
podID="a61efaac-71f4-4842-9e63-bc48ee87fa41" containerID="c1c9a4f78a6a07b37931027ee41e8dcc256ca85541f3707dbfad2681b41e6ab1" exitCode=0 Dec 03 20:20:34.327806 master-0 kubenswrapper[29252]: I1203 20:20:34.327802 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q" event={"ID":"a61efaac-71f4-4842-9e63-bc48ee87fa41","Type":"ContainerDied","Data":"c1c9a4f78a6a07b37931027ee41e8dcc256ca85541f3707dbfad2681b41e6ab1"} Dec 03 20:20:34.330274 master-0 kubenswrapper[29252]: I1203 20:20:34.330246 29252 generic.go:334] "Generic (PLEG): container finished" podID="48466adb-a4ff-4794-b6e3-ac171a498426" containerID="218e21b83d72be74b93fa46f1cacec9a3087b70c8b4538ade1f338c13c0976cc" exitCode=0 Dec 03 20:20:34.330356 master-0 kubenswrapper[29252]: I1203 20:20:34.330312 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp" event={"ID":"48466adb-a4ff-4794-b6e3-ac171a498426","Type":"ContainerDied","Data":"218e21b83d72be74b93fa46f1cacec9a3087b70c8b4538ade1f338c13c0976cc"} Dec 03 20:20:34.707661 master-0 kubenswrapper[29252]: I1203 20:20:34.707619 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh" Dec 03 20:20:34.836214 master-0 kubenswrapper[29252]: I1203 20:20:34.836167 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/189561f3-f21c-4152-bc93-618725d14d4e-util\") pod \"189561f3-f21c-4152-bc93-618725d14d4e\" (UID: \"189561f3-f21c-4152-bc93-618725d14d4e\") " Dec 03 20:20:34.836466 master-0 kubenswrapper[29252]: I1203 20:20:34.836453 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/189561f3-f21c-4152-bc93-618725d14d4e-bundle\") pod \"189561f3-f21c-4152-bc93-618725d14d4e\" (UID: \"189561f3-f21c-4152-bc93-618725d14d4e\") " Dec 03 20:20:34.836588 master-0 kubenswrapper[29252]: I1203 20:20:34.836572 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z2nr\" (UniqueName: \"kubernetes.io/projected/189561f3-f21c-4152-bc93-618725d14d4e-kube-api-access-2z2nr\") pod \"189561f3-f21c-4152-bc93-618725d14d4e\" (UID: \"189561f3-f21c-4152-bc93-618725d14d4e\") " Dec 03 20:20:34.838964 master-0 kubenswrapper[29252]: I1203 20:20:34.838908 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/189561f3-f21c-4152-bc93-618725d14d4e-bundle" (OuterVolumeSpecName: "bundle") pod "189561f3-f21c-4152-bc93-618725d14d4e" (UID: "189561f3-f21c-4152-bc93-618725d14d4e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:20:34.847163 master-0 kubenswrapper[29252]: I1203 20:20:34.847121 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/189561f3-f21c-4152-bc93-618725d14d4e-util" (OuterVolumeSpecName: "util") pod "189561f3-f21c-4152-bc93-618725d14d4e" (UID: "189561f3-f21c-4152-bc93-618725d14d4e"). 
InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:20:34.852549 master-0 kubenswrapper[29252]: I1203 20:20:34.852493 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/189561f3-f21c-4152-bc93-618725d14d4e-kube-api-access-2z2nr" (OuterVolumeSpecName: "kube-api-access-2z2nr") pod "189561f3-f21c-4152-bc93-618725d14d4e" (UID: "189561f3-f21c-4152-bc93-618725d14d4e"). InnerVolumeSpecName "kube-api-access-2z2nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:20:34.938566 master-0 kubenswrapper[29252]: I1203 20:20:34.938452 29252 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/189561f3-f21c-4152-bc93-618725d14d4e-util\") on node \"master-0\" DevicePath \"\"" Dec 03 20:20:34.938566 master-0 kubenswrapper[29252]: I1203 20:20:34.938493 29252 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/189561f3-f21c-4152-bc93-618725d14d4e-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 20:20:34.938566 master-0 kubenswrapper[29252]: I1203 20:20:34.938508 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z2nr\" (UniqueName: \"kubernetes.io/projected/189561f3-f21c-4152-bc93-618725d14d4e-kube-api-access-2z2nr\") on node \"master-0\" DevicePath \"\"" Dec 03 20:20:35.344437 master-0 kubenswrapper[29252]: I1203 20:20:35.344302 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh" event={"ID":"189561f3-f21c-4152-bc93-618725d14d4e","Type":"ContainerDied","Data":"94c7000003e92b07bbf8893f1044f7badf79e67b6251d73a4cce193b18b7b31c"} Dec 03 20:20:35.344437 master-0 kubenswrapper[29252]: I1203 20:20:35.344365 29252 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="94c7000003e92b07bbf8893f1044f7badf79e67b6251d73a4cce193b18b7b31c" Dec 03 20:20:35.344437 master-0 kubenswrapper[29252]: I1203 20:20:35.344408 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83mgckh" Dec 03 20:20:35.785056 master-0 kubenswrapper[29252]: I1203 20:20:35.785026 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp" Dec 03 20:20:35.789047 master-0 kubenswrapper[29252]: I1203 20:20:35.789027 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q" Dec 03 20:20:35.857961 master-0 kubenswrapper[29252]: I1203 20:20:35.857862 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48466adb-a4ff-4794-b6e3-ac171a498426-util\") pod \"48466adb-a4ff-4794-b6e3-ac171a498426\" (UID: \"48466adb-a4ff-4794-b6e3-ac171a498426\") " Dec 03 20:20:35.858243 master-0 kubenswrapper[29252]: I1203 20:20:35.857974 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jk8k\" (UniqueName: \"kubernetes.io/projected/a61efaac-71f4-4842-9e63-bc48ee87fa41-kube-api-access-6jk8k\") pod \"a61efaac-71f4-4842-9e63-bc48ee87fa41\" (UID: \"a61efaac-71f4-4842-9e63-bc48ee87fa41\") " Dec 03 20:20:35.858549 master-0 kubenswrapper[29252]: I1203 20:20:35.858511 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t8rr\" (UniqueName: \"kubernetes.io/projected/48466adb-a4ff-4794-b6e3-ac171a498426-kube-api-access-8t8rr\") pod \"48466adb-a4ff-4794-b6e3-ac171a498426\" (UID: \"48466adb-a4ff-4794-b6e3-ac171a498426\") " Dec 03 20:20:35.858654 master-0 kubenswrapper[29252]: 
I1203 20:20:35.858557 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a61efaac-71f4-4842-9e63-bc48ee87fa41-util\") pod \"a61efaac-71f4-4842-9e63-bc48ee87fa41\" (UID: \"a61efaac-71f4-4842-9e63-bc48ee87fa41\") " Dec 03 20:20:35.858654 master-0 kubenswrapper[29252]: I1203 20:20:35.858606 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a61efaac-71f4-4842-9e63-bc48ee87fa41-bundle\") pod \"a61efaac-71f4-4842-9e63-bc48ee87fa41\" (UID: \"a61efaac-71f4-4842-9e63-bc48ee87fa41\") " Dec 03 20:20:35.858823 master-0 kubenswrapper[29252]: I1203 20:20:35.858653 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48466adb-a4ff-4794-b6e3-ac171a498426-bundle\") pod \"48466adb-a4ff-4794-b6e3-ac171a498426\" (UID: \"48466adb-a4ff-4794-b6e3-ac171a498426\") " Dec 03 20:20:35.860251 master-0 kubenswrapper[29252]: I1203 20:20:35.860173 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a61efaac-71f4-4842-9e63-bc48ee87fa41-bundle" (OuterVolumeSpecName: "bundle") pod "a61efaac-71f4-4842-9e63-bc48ee87fa41" (UID: "a61efaac-71f4-4842-9e63-bc48ee87fa41"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:20:35.862312 master-0 kubenswrapper[29252]: I1203 20:20:35.862264 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48466adb-a4ff-4794-b6e3-ac171a498426-kube-api-access-8t8rr" (OuterVolumeSpecName: "kube-api-access-8t8rr") pod "48466adb-a4ff-4794-b6e3-ac171a498426" (UID: "48466adb-a4ff-4794-b6e3-ac171a498426"). InnerVolumeSpecName "kube-api-access-8t8rr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:20:35.862665 master-0 kubenswrapper[29252]: I1203 20:20:35.862559 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48466adb-a4ff-4794-b6e3-ac171a498426-bundle" (OuterVolumeSpecName: "bundle") pod "48466adb-a4ff-4794-b6e3-ac171a498426" (UID: "48466adb-a4ff-4794-b6e3-ac171a498426"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:20:35.864210 master-0 kubenswrapper[29252]: I1203 20:20:35.864024 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a61efaac-71f4-4842-9e63-bc48ee87fa41-kube-api-access-6jk8k" (OuterVolumeSpecName: "kube-api-access-6jk8k") pod "a61efaac-71f4-4842-9e63-bc48ee87fa41" (UID: "a61efaac-71f4-4842-9e63-bc48ee87fa41"). InnerVolumeSpecName "kube-api-access-6jk8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:20:35.870141 master-0 kubenswrapper[29252]: I1203 20:20:35.870093 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a61efaac-71f4-4842-9e63-bc48ee87fa41-util" (OuterVolumeSpecName: "util") pod "a61efaac-71f4-4842-9e63-bc48ee87fa41" (UID: "a61efaac-71f4-4842-9e63-bc48ee87fa41"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:20:35.886084 master-0 kubenswrapper[29252]: I1203 20:20:35.886033 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48466adb-a4ff-4794-b6e3-ac171a498426-util" (OuterVolumeSpecName: "util") pod "48466adb-a4ff-4794-b6e3-ac171a498426" (UID: "48466adb-a4ff-4794-b6e3-ac171a498426"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 20:20:35.960713 master-0 kubenswrapper[29252]: I1203 20:20:35.960458 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t8rr\" (UniqueName: \"kubernetes.io/projected/48466adb-a4ff-4794-b6e3-ac171a498426-kube-api-access-8t8rr\") on node \"master-0\" DevicePath \"\""
Dec 03 20:20:35.960713 master-0 kubenswrapper[29252]: I1203 20:20:35.960715 29252 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a61efaac-71f4-4842-9e63-bc48ee87fa41-util\") on node \"master-0\" DevicePath \"\""
Dec 03 20:20:35.960966 master-0 kubenswrapper[29252]: I1203 20:20:35.960729 29252 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a61efaac-71f4-4842-9e63-bc48ee87fa41-bundle\") on node \"master-0\" DevicePath \"\""
Dec 03 20:20:35.960966 master-0 kubenswrapper[29252]: I1203 20:20:35.960743 29252 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48466adb-a4ff-4794-b6e3-ac171a498426-bundle\") on node \"master-0\" DevicePath \"\""
Dec 03 20:20:35.960966 master-0 kubenswrapper[29252]: I1203 20:20:35.960754 29252 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48466adb-a4ff-4794-b6e3-ac171a498426-util\") on node \"master-0\" DevicePath \"\""
Dec 03 20:20:35.960966 master-0 kubenswrapper[29252]: I1203 20:20:35.960765 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jk8k\" (UniqueName: \"kubernetes.io/projected/a61efaac-71f4-4842-9e63-bc48ee87fa41-kube-api-access-6jk8k\") on node \"master-0\" DevicePath \"\""
Dec 03 20:20:36.354864 master-0 kubenswrapper[29252]: I1203 20:20:36.354807 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp" event={"ID":"48466adb-a4ff-4794-b6e3-ac171a498426","Type":"ContainerDied","Data":"70889f7542978ced7c8a4a5a9ac9ae76deca15322fed056e1d9279ba6c2295e7"}
Dec 03 20:20:36.354864 master-0 kubenswrapper[29252]: I1203 20:20:36.354857 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70889f7542978ced7c8a4a5a9ac9ae76deca15322fed056e1d9279ba6c2295e7"
Dec 03 20:20:36.355400 master-0 kubenswrapper[29252]: I1203 20:20:36.354860 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9wfjp"
Dec 03 20:20:36.357513 master-0 kubenswrapper[29252]: I1203 20:20:36.357484 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q" event={"ID":"a61efaac-71f4-4842-9e63-bc48ee87fa41","Type":"ContainerDied","Data":"5c0129dee58d7608520e8dd0be65a9046b80c068544f3119e2115eef7ba0801a"}
Dec 03 20:20:36.357596 master-0 kubenswrapper[29252]: I1203 20:20:36.357514 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c0129dee58d7608520e8dd0be65a9046b80c068544f3119e2115eef7ba0801a"
Dec 03 20:20:36.357596 master-0 kubenswrapper[29252]: I1203 20:20:36.357573 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fmsk4q"
Dec 03 20:20:37.329357 master-0 kubenswrapper[29252]: I1203 20:20:37.329269 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4"]
Dec 03 20:20:37.329632 master-0 kubenswrapper[29252]: E1203 20:20:37.329554 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a61efaac-71f4-4842-9e63-bc48ee87fa41" containerName="util"
Dec 03 20:20:37.329632 master-0 kubenswrapper[29252]: I1203 20:20:37.329569 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="a61efaac-71f4-4842-9e63-bc48ee87fa41" containerName="util"
Dec 03 20:20:37.329632 master-0 kubenswrapper[29252]: E1203 20:20:37.329588 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48466adb-a4ff-4794-b6e3-ac171a498426" containerName="util"
Dec 03 20:20:37.329632 master-0 kubenswrapper[29252]: I1203 20:20:37.329594 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="48466adb-a4ff-4794-b6e3-ac171a498426" containerName="util"
Dec 03 20:20:37.329632 master-0 kubenswrapper[29252]: E1203 20:20:37.329602 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189561f3-f21c-4152-bc93-618725d14d4e" containerName="extract"
Dec 03 20:20:37.329632 master-0 kubenswrapper[29252]: I1203 20:20:37.329610 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="189561f3-f21c-4152-bc93-618725d14d4e" containerName="extract"
Dec 03 20:20:37.329632 master-0 kubenswrapper[29252]: E1203 20:20:37.329620 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a61efaac-71f4-4842-9e63-bc48ee87fa41" containerName="extract"
Dec 03 20:20:37.329632 master-0 kubenswrapper[29252]: I1203 20:20:37.329625 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="a61efaac-71f4-4842-9e63-bc48ee87fa41" containerName="extract"
Dec 03 20:20:37.329632 master-0 kubenswrapper[29252]: E1203 20:20:37.329639 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a61efaac-71f4-4842-9e63-bc48ee87fa41" containerName="pull"
Dec 03 20:20:37.329632 master-0 kubenswrapper[29252]: I1203 20:20:37.329645 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="a61efaac-71f4-4842-9e63-bc48ee87fa41" containerName="pull"
Dec 03 20:20:37.330183 master-0 kubenswrapper[29252]: E1203 20:20:37.329660 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48466adb-a4ff-4794-b6e3-ac171a498426" containerName="pull"
Dec 03 20:20:37.330183 master-0 kubenswrapper[29252]: I1203 20:20:37.329666 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="48466adb-a4ff-4794-b6e3-ac171a498426" containerName="pull"
Dec 03 20:20:37.330183 master-0 kubenswrapper[29252]: E1203 20:20:37.329676 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48466adb-a4ff-4794-b6e3-ac171a498426" containerName="extract"
Dec 03 20:20:37.330183 master-0 kubenswrapper[29252]: I1203 20:20:37.329681 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="48466adb-a4ff-4794-b6e3-ac171a498426" containerName="extract"
Dec 03 20:20:37.330183 master-0 kubenswrapper[29252]: E1203 20:20:37.329690 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189561f3-f21c-4152-bc93-618725d14d4e" containerName="pull"
Dec 03 20:20:37.330183 master-0 kubenswrapper[29252]: I1203 20:20:37.329695 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="189561f3-f21c-4152-bc93-618725d14d4e" containerName="pull"
Dec 03 20:20:37.330183 master-0 kubenswrapper[29252]: E1203 20:20:37.329705 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189561f3-f21c-4152-bc93-618725d14d4e" containerName="util"
Dec 03 20:20:37.330183 master-0 kubenswrapper[29252]: I1203 20:20:37.329711 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="189561f3-f21c-4152-bc93-618725d14d4e" containerName="util"
Dec 03 20:20:37.330183 master-0 kubenswrapper[29252]: I1203 20:20:37.329851 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="a61efaac-71f4-4842-9e63-bc48ee87fa41" containerName="extract"
Dec 03 20:20:37.330183 master-0 kubenswrapper[29252]: I1203 20:20:37.329871 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="48466adb-a4ff-4794-b6e3-ac171a498426" containerName="extract"
Dec 03 20:20:37.330183 master-0 kubenswrapper[29252]: I1203 20:20:37.329898 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="189561f3-f21c-4152-bc93-618725d14d4e" containerName="extract"
Dec 03 20:20:37.330837 master-0 kubenswrapper[29252]: I1203 20:20:37.330803 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4"
Dec 03 20:20:37.348191 master-0 kubenswrapper[29252]: I1203 20:20:37.347833 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4"]
Dec 03 20:20:37.382770 master-0 kubenswrapper[29252]: I1203 20:20:37.382688 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sk29\" (UniqueName: \"kubernetes.io/projected/571f83ee-075b-466a-b0ff-de17b30b76b8-kube-api-access-5sk29\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4\" (UID: \"571f83ee-075b-466a-b0ff-de17b30b76b8\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4"
Dec 03 20:20:37.383461 master-0 kubenswrapper[29252]: I1203 20:20:37.382918 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/571f83ee-075b-466a-b0ff-de17b30b76b8-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4\" (UID: \"571f83ee-075b-466a-b0ff-de17b30b76b8\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4"
Dec 03 20:20:37.383461 master-0 kubenswrapper[29252]: I1203 20:20:37.383335 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/571f83ee-075b-466a-b0ff-de17b30b76b8-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4\" (UID: \"571f83ee-075b-466a-b0ff-de17b30b76b8\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4"
Dec 03 20:20:37.485487 master-0 kubenswrapper[29252]: I1203 20:20:37.485395 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/571f83ee-075b-466a-b0ff-de17b30b76b8-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4\" (UID: \"571f83ee-075b-466a-b0ff-de17b30b76b8\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4"
Dec 03 20:20:37.485685 master-0 kubenswrapper[29252]: I1203 20:20:37.485518 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/571f83ee-075b-466a-b0ff-de17b30b76b8-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4\" (UID: \"571f83ee-075b-466a-b0ff-de17b30b76b8\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4"
Dec 03 20:20:37.485758 master-0 kubenswrapper[29252]: I1203 20:20:37.485643 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sk29\" (UniqueName: \"kubernetes.io/projected/571f83ee-075b-466a-b0ff-de17b30b76b8-kube-api-access-5sk29\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4\" (UID: \"571f83ee-075b-466a-b0ff-de17b30b76b8\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4"
Dec 03 20:20:37.486257 master-0 kubenswrapper[29252]: I1203 20:20:37.486240 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/571f83ee-075b-466a-b0ff-de17b30b76b8-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4\" (UID: \"571f83ee-075b-466a-b0ff-de17b30b76b8\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4"
Dec 03 20:20:37.486409 master-0 kubenswrapper[29252]: I1203 20:20:37.486362 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/571f83ee-075b-466a-b0ff-de17b30b76b8-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4\" (UID: \"571f83ee-075b-466a-b0ff-de17b30b76b8\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4"
Dec 03 20:20:37.503576 master-0 kubenswrapper[29252]: I1203 20:20:37.503508 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sk29\" (UniqueName: \"kubernetes.io/projected/571f83ee-075b-466a-b0ff-de17b30b76b8-kube-api-access-5sk29\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4\" (UID: \"571f83ee-075b-466a-b0ff-de17b30b76b8\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4"
Dec 03 20:20:37.673189 master-0 kubenswrapper[29252]: I1203 20:20:37.673112 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4"
Dec 03 20:20:38.210100 master-0 kubenswrapper[29252]: I1203 20:20:38.209951 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4"]
Dec 03 20:20:38.233510 master-0 kubenswrapper[29252]: W1203 20:20:38.233440 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod571f83ee_075b_466a_b0ff_de17b30b76b8.slice/crio-f605548f297084c83bef30b03f1aa3fab9b2fbe1dde25b51c4ab84e5ec3bfb77 WatchSource:0}: Error finding container f605548f297084c83bef30b03f1aa3fab9b2fbe1dde25b51c4ab84e5ec3bfb77: Status 404 returned error can't find the container with id f605548f297084c83bef30b03f1aa3fab9b2fbe1dde25b51c4ab84e5ec3bfb77
Dec 03 20:20:38.396865 master-0 kubenswrapper[29252]: I1203 20:20:38.395927 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4" event={"ID":"571f83ee-075b-466a-b0ff-de17b30b76b8","Type":"ContainerStarted","Data":"f605548f297084c83bef30b03f1aa3fab9b2fbe1dde25b51c4ab84e5ec3bfb77"}
Dec 03 20:20:39.404670 master-0 kubenswrapper[29252]: I1203 20:20:39.404577 29252 generic.go:334] "Generic (PLEG): container finished" podID="571f83ee-075b-466a-b0ff-de17b30b76b8" containerID="1589108efdc9681c0bee893e0324c5c1f0f20526e89e1f69842088c2f3197552" exitCode=0
Dec 03 20:20:39.404670 master-0 kubenswrapper[29252]: I1203 20:20:39.404663 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4" event={"ID":"571f83ee-075b-466a-b0ff-de17b30b76b8","Type":"ContainerDied","Data":"1589108efdc9681c0bee893e0324c5c1f0f20526e89e1f69842088c2f3197552"}
Dec 03 20:20:41.422670 master-0 kubenswrapper[29252]: I1203 20:20:41.422577 29252 generic.go:334] "Generic (PLEG): container finished" podID="571f83ee-075b-466a-b0ff-de17b30b76b8" containerID="c8260fb5fe8afc6b137f5b08163a1b503ba8d1d4d26fd1c24df2b8ddb1cc5a86" exitCode=0
Dec 03 20:20:41.439187 master-0 kubenswrapper[29252]: I1203 20:20:41.438966 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4" event={"ID":"571f83ee-075b-466a-b0ff-de17b30b76b8","Type":"ContainerDied","Data":"c8260fb5fe8afc6b137f5b08163a1b503ba8d1d4d26fd1c24df2b8ddb1cc5a86"}
Dec 03 20:20:42.257538 master-0 kubenswrapper[29252]: I1203 20:20:42.257452 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fpl56"]
Dec 03 20:20:42.258768 master-0 kubenswrapper[29252]: I1203 20:20:42.258724 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fpl56"
Dec 03 20:20:42.268194 master-0 kubenswrapper[29252]: I1203 20:20:42.267825 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Dec 03 20:20:42.268194 master-0 kubenswrapper[29252]: I1203 20:20:42.268057 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Dec 03 20:20:42.287052 master-0 kubenswrapper[29252]: I1203 20:20:42.272300 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fpl56"]
Dec 03 20:20:42.373946 master-0 kubenswrapper[29252]: I1203 20:20:42.373035 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6j8v\" (UniqueName: \"kubernetes.io/projected/b9117e15-fd90-489d-9112-80d5464923af-kube-api-access-x6j8v\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fpl56\" (UID: \"b9117e15-fd90-489d-9112-80d5464923af\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fpl56"
Dec 03 20:20:42.373946 master-0 kubenswrapper[29252]: I1203 20:20:42.373196 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9117e15-fd90-489d-9112-80d5464923af-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fpl56\" (UID: \"b9117e15-fd90-489d-9112-80d5464923af\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fpl56"
Dec 03 20:20:42.447900 master-0 kubenswrapper[29252]: I1203 20:20:42.447742 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4" event={"ID":"571f83ee-075b-466a-b0ff-de17b30b76b8","Type":"ContainerStarted","Data":"b02428d2d3e633ae0767803f4fdbbf217685838bc32bcb76239cb326e23348fb"}
Dec 03 20:20:42.531805 master-0 kubenswrapper[29252]: I1203 20:20:42.525361 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4" podStartSLOduration=4.548833347 podStartE2EDuration="5.525339294s" podCreationTimestamp="2025-12-03 20:20:37 +0000 UTC" firstStartedPulling="2025-12-03 20:20:39.405946807 +0000 UTC m=+674.219491760" lastFinishedPulling="2025-12-03 20:20:40.382452754 +0000 UTC m=+675.195997707" observedRunningTime="2025-12-03 20:20:42.486838124 +0000 UTC m=+677.300383087" watchObservedRunningTime="2025-12-03 20:20:42.525339294 +0000 UTC m=+677.338884237"
Dec 03 20:20:42.531805 master-0 kubenswrapper[29252]: I1203 20:20:42.527214 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9117e15-fd90-489d-9112-80d5464923af-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fpl56\" (UID: \"b9117e15-fd90-489d-9112-80d5464923af\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fpl56"
Dec 03 20:20:42.531805 master-0 kubenswrapper[29252]: I1203 20:20:42.527286 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6j8v\" (UniqueName: \"kubernetes.io/projected/b9117e15-fd90-489d-9112-80d5464923af-kube-api-access-x6j8v\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fpl56\" (UID: \"b9117e15-fd90-489d-9112-80d5464923af\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fpl56"
Dec 03 20:20:42.531805 master-0 kubenswrapper[29252]: I1203 20:20:42.529271 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9117e15-fd90-489d-9112-80d5464923af-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fpl56\" (UID: \"b9117e15-fd90-489d-9112-80d5464923af\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fpl56"
Dec 03 20:20:42.561904 master-0 kubenswrapper[29252]: I1203 20:20:42.555765 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6j8v\" (UniqueName: \"kubernetes.io/projected/b9117e15-fd90-489d-9112-80d5464923af-kube-api-access-x6j8v\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fpl56\" (UID: \"b9117e15-fd90-489d-9112-80d5464923af\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fpl56"
Dec 03 20:20:42.614803 master-0 kubenswrapper[29252]: I1203 20:20:42.610883 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fpl56"
Dec 03 20:20:43.114673 master-0 kubenswrapper[29252]: I1203 20:20:43.114620 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fpl56"]
Dec 03 20:20:43.118565 master-0 kubenswrapper[29252]: W1203 20:20:43.118487 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9117e15_fd90_489d_9112_80d5464923af.slice/crio-a400fda428e0a0b0fea288a0072daecca0057046e89814d0f0b57716049c5cb5 WatchSource:0}: Error finding container a400fda428e0a0b0fea288a0072daecca0057046e89814d0f0b57716049c5cb5: Status 404 returned error can't find the container with id a400fda428e0a0b0fea288a0072daecca0057046e89814d0f0b57716049c5cb5
Dec 03 20:20:43.492858 master-0 kubenswrapper[29252]: I1203 20:20:43.492458 29252 generic.go:334] "Generic (PLEG): container finished" podID="571f83ee-075b-466a-b0ff-de17b30b76b8" containerID="b02428d2d3e633ae0767803f4fdbbf217685838bc32bcb76239cb326e23348fb" exitCode=0
Dec 03 20:20:43.492858 master-0 kubenswrapper[29252]: I1203 20:20:43.492558 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4" event={"ID":"571f83ee-075b-466a-b0ff-de17b30b76b8","Type":"ContainerDied","Data":"b02428d2d3e633ae0767803f4fdbbf217685838bc32bcb76239cb326e23348fb"}
Dec 03 20:20:43.516907 master-0 kubenswrapper[29252]: I1203 20:20:43.514910 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fpl56" event={"ID":"b9117e15-fd90-489d-9112-80d5464923af","Type":"ContainerStarted","Data":"a400fda428e0a0b0fea288a0072daecca0057046e89814d0f0b57716049c5cb5"}
Dec 03 20:20:45.046046 master-0 kubenswrapper[29252]: I1203 20:20:45.045988 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4"
Dec 03 20:20:45.183585 master-0 kubenswrapper[29252]: I1203 20:20:45.183535 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/571f83ee-075b-466a-b0ff-de17b30b76b8-bundle\") pod \"571f83ee-075b-466a-b0ff-de17b30b76b8\" (UID: \"571f83ee-075b-466a-b0ff-de17b30b76b8\") "
Dec 03 20:20:45.183838 master-0 kubenswrapper[29252]: I1203 20:20:45.183732 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/571f83ee-075b-466a-b0ff-de17b30b76b8-util\") pod \"571f83ee-075b-466a-b0ff-de17b30b76b8\" (UID: \"571f83ee-075b-466a-b0ff-de17b30b76b8\") "
Dec 03 20:20:45.183838 master-0 kubenswrapper[29252]: I1203 20:20:45.183801 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sk29\" (UniqueName: \"kubernetes.io/projected/571f83ee-075b-466a-b0ff-de17b30b76b8-kube-api-access-5sk29\") pod \"571f83ee-075b-466a-b0ff-de17b30b76b8\" (UID: \"571f83ee-075b-466a-b0ff-de17b30b76b8\") "
Dec 03 20:20:45.187695 master-0 kubenswrapper[29252]: I1203 20:20:45.187626 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/571f83ee-075b-466a-b0ff-de17b30b76b8-bundle" (OuterVolumeSpecName: "bundle") pod "571f83ee-075b-466a-b0ff-de17b30b76b8" (UID: "571f83ee-075b-466a-b0ff-de17b30b76b8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 20:20:45.188502 master-0 kubenswrapper[29252]: I1203 20:20:45.188465 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/571f83ee-075b-466a-b0ff-de17b30b76b8-kube-api-access-5sk29" (OuterVolumeSpecName: "kube-api-access-5sk29") pod "571f83ee-075b-466a-b0ff-de17b30b76b8" (UID: "571f83ee-075b-466a-b0ff-de17b30b76b8"). InnerVolumeSpecName "kube-api-access-5sk29". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 20:20:45.200186 master-0 kubenswrapper[29252]: I1203 20:20:45.200040 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/571f83ee-075b-466a-b0ff-de17b30b76b8-util" (OuterVolumeSpecName: "util") pod "571f83ee-075b-466a-b0ff-de17b30b76b8" (UID: "571f83ee-075b-466a-b0ff-de17b30b76b8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 20:20:45.285459 master-0 kubenswrapper[29252]: I1203 20:20:45.285346 29252 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/571f83ee-075b-466a-b0ff-de17b30b76b8-util\") on node \"master-0\" DevicePath \"\""
Dec 03 20:20:45.285459 master-0 kubenswrapper[29252]: I1203 20:20:45.285389 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sk29\" (UniqueName: \"kubernetes.io/projected/571f83ee-075b-466a-b0ff-de17b30b76b8-kube-api-access-5sk29\") on node \"master-0\" DevicePath \"\""
Dec 03 20:20:45.285459 master-0 kubenswrapper[29252]: I1203 20:20:45.285403 29252 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/571f83ee-075b-466a-b0ff-de17b30b76b8-bundle\") on node \"master-0\" DevicePath \"\""
Dec 03 20:20:45.535571 master-0 kubenswrapper[29252]: I1203 20:20:45.535446 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4" event={"ID":"571f83ee-075b-466a-b0ff-de17b30b76b8","Type":"ContainerDied","Data":"f605548f297084c83bef30b03f1aa3fab9b2fbe1dde25b51c4ab84e5ec3bfb77"}
Dec 03 20:20:45.535571 master-0 kubenswrapper[29252]: I1203 20:20:45.535493 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f605548f297084c83bef30b03f1aa3fab9b2fbe1dde25b51c4ab84e5ec3bfb77"
Dec 03 20:20:45.535803 master-0 kubenswrapper[29252]: I1203 20:20:45.535578 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210c66s4"
Dec 03 20:20:51.602935 master-0 kubenswrapper[29252]: I1203 20:20:51.602848 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fpl56" event={"ID":"b9117e15-fd90-489d-9112-80d5464923af","Type":"ContainerStarted","Data":"535bd73b1ee24a516bd2602c6b66b39fa92fcde572a41151afb34a90be82695f"}
Dec 03 20:20:51.629078 master-0 kubenswrapper[29252]: I1203 20:20:51.628946 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fpl56" podStartSLOduration=1.908699773 podStartE2EDuration="9.628911948s" podCreationTimestamp="2025-12-03 20:20:42 +0000 UTC" firstStartedPulling="2025-12-03 20:20:43.121708052 +0000 UTC m=+677.935253035" lastFinishedPulling="2025-12-03 20:20:50.841920257 +0000 UTC m=+685.655465210" observedRunningTime="2025-12-03 20:20:51.624664504 +0000 UTC m=+686.438209467" watchObservedRunningTime="2025-12-03 20:20:51.628911948 +0000 UTC m=+686.442456971"
Dec 03 20:20:53.888567 master-0 kubenswrapper[29252]: I1203 20:20:53.888497 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-dccxw"]
Dec 03 20:20:53.889303 master-0 kubenswrapper[29252]: E1203 20:20:53.888821 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571f83ee-075b-466a-b0ff-de17b30b76b8" containerName="extract"
Dec 03 20:20:53.889303 master-0 kubenswrapper[29252]: I1203 20:20:53.888836 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="571f83ee-075b-466a-b0ff-de17b30b76b8" containerName="extract"
Dec 03 20:20:53.889303 master-0 kubenswrapper[29252]: E1203 20:20:53.888865 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571f83ee-075b-466a-b0ff-de17b30b76b8" containerName="util"
Dec 03 20:20:53.889303 master-0 kubenswrapper[29252]: I1203 20:20:53.888873 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="571f83ee-075b-466a-b0ff-de17b30b76b8" containerName="util"
Dec 03 20:20:53.889303 master-0 kubenswrapper[29252]: E1203 20:20:53.888897 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571f83ee-075b-466a-b0ff-de17b30b76b8" containerName="pull"
Dec 03 20:20:53.889303 master-0 kubenswrapper[29252]: I1203 20:20:53.888906 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="571f83ee-075b-466a-b0ff-de17b30b76b8" containerName="pull"
Dec 03 20:20:53.889303 master-0 kubenswrapper[29252]: I1203 20:20:53.889263 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="571f83ee-075b-466a-b0ff-de17b30b76b8" containerName="extract"
Dec 03 20:20:53.889815 master-0 kubenswrapper[29252]: I1203 20:20:53.889771 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-dccxw"
Dec 03 20:20:53.894191 master-0 kubenswrapper[29252]: I1203 20:20:53.892987 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Dec 03 20:20:53.894191 master-0 kubenswrapper[29252]: I1203 20:20:53.893508 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Dec 03 20:20:53.906526 master-0 kubenswrapper[29252]: I1203 20:20:53.906466 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-dccxw"]
Dec 03 20:20:53.964859 master-0 kubenswrapper[29252]: I1203 20:20:53.964640 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwf8j\" (UniqueName: \"kubernetes.io/projected/ad5369d7-39be-4259-a3b3-c67744ea990a-kube-api-access-pwf8j\") pod \"cert-manager-webhook-f4fb5df64-dccxw\" (UID: \"ad5369d7-39be-4259-a3b3-c67744ea990a\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-dccxw"
Dec 03 20:20:53.964859 master-0 kubenswrapper[29252]: I1203 20:20:53.964740 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad5369d7-39be-4259-a3b3-c67744ea990a-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-dccxw\" (UID: \"ad5369d7-39be-4259-a3b3-c67744ea990a\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-dccxw"
Dec 03 20:20:54.066000 master-0 kubenswrapper[29252]: I1203 20:20:54.065939 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwf8j\" (UniqueName: \"kubernetes.io/projected/ad5369d7-39be-4259-a3b3-c67744ea990a-kube-api-access-pwf8j\") pod \"cert-manager-webhook-f4fb5df64-dccxw\" (UID: \"ad5369d7-39be-4259-a3b3-c67744ea990a\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-dccxw"
Dec 03 20:20:54.066208 master-0 kubenswrapper[29252]: I1203 20:20:54.066074 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad5369d7-39be-4259-a3b3-c67744ea990a-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-dccxw\" (UID: \"ad5369d7-39be-4259-a3b3-c67744ea990a\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-dccxw"
Dec 03 20:20:54.106562 master-0 kubenswrapper[29252]: I1203 20:20:54.106513 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad5369d7-39be-4259-a3b3-c67744ea990a-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-dccxw\" (UID: \"ad5369d7-39be-4259-a3b3-c67744ea990a\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-dccxw"
Dec 03 20:20:54.106761 master-0 kubenswrapper[29252]: I1203 20:20:54.106687 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwf8j\" (UniqueName: \"kubernetes.io/projected/ad5369d7-39be-4259-a3b3-c67744ea990a-kube-api-access-pwf8j\") pod \"cert-manager-webhook-f4fb5df64-dccxw\" (UID: \"ad5369d7-39be-4259-a3b3-c67744ea990a\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-dccxw"
Dec 03 20:20:54.205843 master-0 kubenswrapper[29252]: I1203 20:20:54.205693 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-dccxw"
Dec 03 20:20:54.655105 master-0 kubenswrapper[29252]: I1203 20:20:54.655061 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-dccxw"]
Dec 03 20:20:55.661802 master-0 kubenswrapper[29252]: I1203 20:20:55.659049 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-dccxw" event={"ID":"ad5369d7-39be-4259-a3b3-c67744ea990a","Type":"ContainerStarted","Data":"bd06efb30d3ac702f084d443c21ecfbdbea61f8f39a37cffbcbe5e03c36a3536"}
Dec 03 20:20:55.679707 master-0 kubenswrapper[29252]: I1203 20:20:55.679393 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-l9xts"]
Dec 03 20:20:55.680491 master-0 kubenswrapper[29252]: I1203 20:20:55.680459 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-l9xts"
Dec 03 20:20:55.685411 master-0 kubenswrapper[29252]: I1203 20:20:55.685362 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Dec 03 20:20:55.686530 master-0 kubenswrapper[29252]: I1203 20:20:55.685900 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Dec 03 20:20:55.693429 master-0 kubenswrapper[29252]: I1203 20:20:55.693380 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-l9xts"]
Dec 03 20:20:55.806454 master-0 kubenswrapper[29252]: I1203 20:20:55.806340 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68vt7\" (UniqueName: \"kubernetes.io/projected/bc12a15e-d84d-430c-a33c-833407ab976d-kube-api-access-68vt7\") pod \"nmstate-operator-5b5b58f5c8-l9xts\" (UID: \"bc12a15e-d84d-430c-a33c-833407ab976d\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-l9xts"
Dec 03 20:20:55.914570 master-0 kubenswrapper[29252]: I1203 20:20:55.914451 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68vt7\" (UniqueName: \"kubernetes.io/projected/bc12a15e-d84d-430c-a33c-833407ab976d-kube-api-access-68vt7\") pod \"nmstate-operator-5b5b58f5c8-l9xts\" (UID: \"bc12a15e-d84d-430c-a33c-833407ab976d\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-l9xts"
Dec 03 20:20:55.944048 master-0 kubenswrapper[29252]: I1203 20:20:55.942489 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68vt7\" (UniqueName: \"kubernetes.io/projected/bc12a15e-d84d-430c-a33c-833407ab976d-kube-api-access-68vt7\") pod \"nmstate-operator-5b5b58f5c8-l9xts\" (UID: \"bc12a15e-d84d-430c-a33c-833407ab976d\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-l9xts"
Dec 03 20:20:55.998561 master-0 kubenswrapper[29252]: I1203 20:20:55.998496 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-l9xts"
Dec 03 20:20:56.487217 master-0 kubenswrapper[29252]: I1203 20:20:56.487151 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-l9xts"]
Dec 03 20:20:56.666833 master-0 kubenswrapper[29252]: I1203 20:20:56.666761 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-l9xts" event={"ID":"bc12a15e-d84d-430c-a33c-833407ab976d","Type":"ContainerStarted","Data":"78dce2e8bb12d8fc63a6601b3a4dc0bcb6355cf8bf758f4cf4c542f72d882840"}
Dec 03 20:20:57.991121 master-0 kubenswrapper[29252]: I1203 20:20:57.991067 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww"]
Dec 03 20:20:57.993988 master-0 kubenswrapper[29252]: I1203 20:20:57.993949 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww"
Dec 03 20:20:58.002415 master-0 kubenswrapper[29252]: I1203 20:20:57.996437 29252 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Dec 03 20:20:58.002415 master-0 kubenswrapper[29252]: I1203 20:20:57.996842 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Dec 03 20:20:58.002415 master-0 kubenswrapper[29252]: I1203 20:20:57.997024 29252 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Dec 03 20:20:58.002415 master-0 kubenswrapper[29252]: I1203 20:20:57.997195 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Dec 03 20:20:58.007645 master-0 kubenswrapper[29252]: I1203 20:20:58.006923 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww"]
Dec 03 20:20:58.053149 master-0 kubenswrapper[29252]: I1203 20:20:58.053096 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb6f03f1-adfa-4249-b822-7dd4acf245be-apiservice-cert\") pod \"metallb-operator-controller-manager-7d9499bd6f-4h4ww\" (UID: \"cb6f03f1-adfa-4249-b822-7dd4acf245be\") " pod="metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww"
Dec 03 20:20:58.053446 master-0 kubenswrapper[29252]: I1203 20:20:58.053431 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb6f03f1-adfa-4249-b822-7dd4acf245be-webhook-cert\") pod \"metallb-operator-controller-manager-7d9499bd6f-4h4ww\" (UID: \"cb6f03f1-adfa-4249-b822-7dd4acf245be\") " pod="metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww"
Dec 03 20:20:58.053557 master-0 kubenswrapper[29252]: I1203 20:20:58.053538 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8xml\" (UniqueName: \"kubernetes.io/projected/cb6f03f1-adfa-4249-b822-7dd4acf245be-kube-api-access-d8xml\") pod \"metallb-operator-controller-manager-7d9499bd6f-4h4ww\" (UID: \"cb6f03f1-adfa-4249-b822-7dd4acf245be\") " pod="metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww"
Dec 03 20:20:58.155851 master-0 kubenswrapper[29252]: I1203 20:20:58.155573 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb6f03f1-adfa-4249-b822-7dd4acf245be-apiservice-cert\") pod \"metallb-operator-controller-manager-7d9499bd6f-4h4ww\" (UID: \"cb6f03f1-adfa-4249-b822-7dd4acf245be\") " pod="metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww"
Dec 03 20:20:58.155851 master-0 kubenswrapper[29252]: I1203 20:20:58.155635 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb6f03f1-adfa-4249-b822-7dd4acf245be-webhook-cert\") pod \"metallb-operator-controller-manager-7d9499bd6f-4h4ww\" (UID: \"cb6f03f1-adfa-4249-b822-7dd4acf245be\") " pod="metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww"
Dec 03 20:20:58.155851 master-0 kubenswrapper[29252]: I1203 20:20:58.155656 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8xml\" (UniqueName: \"kubernetes.io/projected/cb6f03f1-adfa-4249-b822-7dd4acf245be-kube-api-access-d8xml\") pod \"metallb-operator-controller-manager-7d9499bd6f-4h4ww\" (UID: \"cb6f03f1-adfa-4249-b822-7dd4acf245be\") " pod="metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww"
Dec 03 20:20:58.160327 master-0 kubenswrapper[29252]: I1203
20:20:58.160273 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb6f03f1-adfa-4249-b822-7dd4acf245be-apiservice-cert\") pod \"metallb-operator-controller-manager-7d9499bd6f-4h4ww\" (UID: \"cb6f03f1-adfa-4249-b822-7dd4acf245be\") " pod="metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww" Dec 03 20:20:58.161270 master-0 kubenswrapper[29252]: I1203 20:20:58.161237 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb6f03f1-adfa-4249-b822-7dd4acf245be-webhook-cert\") pod \"metallb-operator-controller-manager-7d9499bd6f-4h4ww\" (UID: \"cb6f03f1-adfa-4249-b822-7dd4acf245be\") " pod="metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww" Dec 03 20:20:58.172365 master-0 kubenswrapper[29252]: I1203 20:20:58.172307 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8xml\" (UniqueName: \"kubernetes.io/projected/cb6f03f1-adfa-4249-b822-7dd4acf245be-kube-api-access-d8xml\") pod \"metallb-operator-controller-manager-7d9499bd6f-4h4ww\" (UID: \"cb6f03f1-adfa-4249-b822-7dd4acf245be\") " pod="metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww" Dec 03 20:20:58.319017 master-0 kubenswrapper[29252]: I1203 20:20:58.318805 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv"] Dec 03 20:20:58.319678 master-0 kubenswrapper[29252]: I1203 20:20:58.319655 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv" Dec 03 20:20:58.322886 master-0 kubenswrapper[29252]: I1203 20:20:58.322389 29252 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 20:20:58.322886 master-0 kubenswrapper[29252]: I1203 20:20:58.322548 29252 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 03 20:20:58.362456 master-0 kubenswrapper[29252]: I1203 20:20:58.362365 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5a2dc21d-ada3-4739-9e62-cbdba8e4985a-apiservice-cert\") pod \"metallb-operator-webhook-server-bfbbd6984-sx9nv\" (UID: \"5a2dc21d-ada3-4739-9e62-cbdba8e4985a\") " pod="metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv" Dec 03 20:20:58.362697 master-0 kubenswrapper[29252]: I1203 20:20:58.362651 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6fdz\" (UniqueName: \"kubernetes.io/projected/5a2dc21d-ada3-4739-9e62-cbdba8e4985a-kube-api-access-z6fdz\") pod \"metallb-operator-webhook-server-bfbbd6984-sx9nv\" (UID: \"5a2dc21d-ada3-4739-9e62-cbdba8e4985a\") " pod="metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv" Dec 03 20:20:58.365704 master-0 kubenswrapper[29252]: I1203 20:20:58.365658 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5a2dc21d-ada3-4739-9e62-cbdba8e4985a-webhook-cert\") pod \"metallb-operator-webhook-server-bfbbd6984-sx9nv\" (UID: \"5a2dc21d-ada3-4739-9e62-cbdba8e4985a\") " pod="metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv" Dec 03 20:20:58.365967 master-0 kubenswrapper[29252]: I1203 20:20:58.365928 29252 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww" Dec 03 20:20:58.376032 master-0 kubenswrapper[29252]: I1203 20:20:58.374972 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv"] Dec 03 20:20:58.472083 master-0 kubenswrapper[29252]: I1203 20:20:58.467615 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5a2dc21d-ada3-4739-9e62-cbdba8e4985a-apiservice-cert\") pod \"metallb-operator-webhook-server-bfbbd6984-sx9nv\" (UID: \"5a2dc21d-ada3-4739-9e62-cbdba8e4985a\") " pod="metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv" Dec 03 20:20:58.472083 master-0 kubenswrapper[29252]: I1203 20:20:58.467716 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6fdz\" (UniqueName: \"kubernetes.io/projected/5a2dc21d-ada3-4739-9e62-cbdba8e4985a-kube-api-access-z6fdz\") pod \"metallb-operator-webhook-server-bfbbd6984-sx9nv\" (UID: \"5a2dc21d-ada3-4739-9e62-cbdba8e4985a\") " pod="metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv" Dec 03 20:20:58.472083 master-0 kubenswrapper[29252]: I1203 20:20:58.467753 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5a2dc21d-ada3-4739-9e62-cbdba8e4985a-webhook-cert\") pod \"metallb-operator-webhook-server-bfbbd6984-sx9nv\" (UID: \"5a2dc21d-ada3-4739-9e62-cbdba8e4985a\") " pod="metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv" Dec 03 20:20:58.472696 master-0 kubenswrapper[29252]: I1203 20:20:58.472665 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5a2dc21d-ada3-4739-9e62-cbdba8e4985a-webhook-cert\") pod \"metallb-operator-webhook-server-bfbbd6984-sx9nv\" (UID: 
\"5a2dc21d-ada3-4739-9e62-cbdba8e4985a\") " pod="metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv" Dec 03 20:20:58.472929 master-0 kubenswrapper[29252]: I1203 20:20:58.472905 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5a2dc21d-ada3-4739-9e62-cbdba8e4985a-apiservice-cert\") pod \"metallb-operator-webhook-server-bfbbd6984-sx9nv\" (UID: \"5a2dc21d-ada3-4739-9e62-cbdba8e4985a\") " pod="metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv" Dec 03 20:20:58.489308 master-0 kubenswrapper[29252]: I1203 20:20:58.489258 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6fdz\" (UniqueName: \"kubernetes.io/projected/5a2dc21d-ada3-4739-9e62-cbdba8e4985a-kube-api-access-z6fdz\") pod \"metallb-operator-webhook-server-bfbbd6984-sx9nv\" (UID: \"5a2dc21d-ada3-4739-9e62-cbdba8e4985a\") " pod="metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv" Dec 03 20:20:58.667859 master-0 kubenswrapper[29252]: I1203 20:20:58.667795 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv" Dec 03 20:20:58.993183 master-0 kubenswrapper[29252]: I1203 20:20:58.993124 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww"] Dec 03 20:21:00.408799 master-0 kubenswrapper[29252]: I1203 20:21:00.408677 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-fdb6t"] Dec 03 20:21:00.410249 master-0 kubenswrapper[29252]: I1203 20:21:00.410191 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-fdb6t" Dec 03 20:21:00.440762 master-0 kubenswrapper[29252]: I1203 20:21:00.440295 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-fdb6t"] Dec 03 20:21:00.527915 master-0 kubenswrapper[29252]: I1203 20:21:00.527857 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f07a0e83-7142-455c-bbbd-5b7b10b03bc0-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-fdb6t\" (UID: \"f07a0e83-7142-455c-bbbd-5b7b10b03bc0\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-fdb6t" Dec 03 20:21:00.528142 master-0 kubenswrapper[29252]: I1203 20:21:00.528118 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjx7p\" (UniqueName: \"kubernetes.io/projected/f07a0e83-7142-455c-bbbd-5b7b10b03bc0-kube-api-access-kjx7p\") pod \"cert-manager-cainjector-855d9ccff4-fdb6t\" (UID: \"f07a0e83-7142-455c-bbbd-5b7b10b03bc0\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-fdb6t" Dec 03 20:21:00.634797 master-0 kubenswrapper[29252]: I1203 20:21:00.632200 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjx7p\" (UniqueName: \"kubernetes.io/projected/f07a0e83-7142-455c-bbbd-5b7b10b03bc0-kube-api-access-kjx7p\") pod \"cert-manager-cainjector-855d9ccff4-fdb6t\" (UID: \"f07a0e83-7142-455c-bbbd-5b7b10b03bc0\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-fdb6t" Dec 03 20:21:00.634797 master-0 kubenswrapper[29252]: I1203 20:21:00.632283 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f07a0e83-7142-455c-bbbd-5b7b10b03bc0-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-fdb6t\" (UID: 
\"f07a0e83-7142-455c-bbbd-5b7b10b03bc0\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-fdb6t" Dec 03 20:21:00.655411 master-0 kubenswrapper[29252]: I1203 20:21:00.655348 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f07a0e83-7142-455c-bbbd-5b7b10b03bc0-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-fdb6t\" (UID: \"f07a0e83-7142-455c-bbbd-5b7b10b03bc0\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-fdb6t" Dec 03 20:21:00.655650 master-0 kubenswrapper[29252]: I1203 20:21:00.655506 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjx7p\" (UniqueName: \"kubernetes.io/projected/f07a0e83-7142-455c-bbbd-5b7b10b03bc0-kube-api-access-kjx7p\") pod \"cert-manager-cainjector-855d9ccff4-fdb6t\" (UID: \"f07a0e83-7142-455c-bbbd-5b7b10b03bc0\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-fdb6t" Dec 03 20:21:00.758238 master-0 kubenswrapper[29252]: I1203 20:21:00.758108 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-fdb6t" Dec 03 20:21:03.653164 master-0 kubenswrapper[29252]: I1203 20:21:03.653066 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-fzh89"] Dec 03 20:21:03.658307 master-0 kubenswrapper[29252]: I1203 20:21:03.658255 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-fzh89" Dec 03 20:21:03.694088 master-0 kubenswrapper[29252]: I1203 20:21:03.680231 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-fzh89"] Dec 03 20:21:03.839202 master-0 kubenswrapper[29252]: I1203 20:21:03.839141 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsmdp\" (UniqueName: \"kubernetes.io/projected/78275772-3b78-4283-8b15-f28695b4a15f-kube-api-access-bsmdp\") pod \"cert-manager-86cb77c54b-fzh89\" (UID: \"78275772-3b78-4283-8b15-f28695b4a15f\") " pod="cert-manager/cert-manager-86cb77c54b-fzh89" Dec 03 20:21:03.839508 master-0 kubenswrapper[29252]: I1203 20:21:03.839221 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78275772-3b78-4283-8b15-f28695b4a15f-bound-sa-token\") pod \"cert-manager-86cb77c54b-fzh89\" (UID: \"78275772-3b78-4283-8b15-f28695b4a15f\") " pod="cert-manager/cert-manager-86cb77c54b-fzh89" Dec 03 20:21:03.941547 master-0 kubenswrapper[29252]: I1203 20:21:03.941404 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsmdp\" (UniqueName: \"kubernetes.io/projected/78275772-3b78-4283-8b15-f28695b4a15f-kube-api-access-bsmdp\") pod \"cert-manager-86cb77c54b-fzh89\" (UID: \"78275772-3b78-4283-8b15-f28695b4a15f\") " pod="cert-manager/cert-manager-86cb77c54b-fzh89" Dec 03 20:21:03.941547 master-0 kubenswrapper[29252]: I1203 20:21:03.941485 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78275772-3b78-4283-8b15-f28695b4a15f-bound-sa-token\") pod \"cert-manager-86cb77c54b-fzh89\" (UID: \"78275772-3b78-4283-8b15-f28695b4a15f\") " pod="cert-manager/cert-manager-86cb77c54b-fzh89" Dec 03 20:21:03.972031 master-0 
kubenswrapper[29252]: I1203 20:21:03.971326 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsmdp\" (UniqueName: \"kubernetes.io/projected/78275772-3b78-4283-8b15-f28695b4a15f-kube-api-access-bsmdp\") pod \"cert-manager-86cb77c54b-fzh89\" (UID: \"78275772-3b78-4283-8b15-f28695b4a15f\") " pod="cert-manager/cert-manager-86cb77c54b-fzh89" Dec 03 20:21:03.973903 master-0 kubenswrapper[29252]: I1203 20:21:03.973590 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78275772-3b78-4283-8b15-f28695b4a15f-bound-sa-token\") pod \"cert-manager-86cb77c54b-fzh89\" (UID: \"78275772-3b78-4283-8b15-f28695b4a15f\") " pod="cert-manager/cert-manager-86cb77c54b-fzh89" Dec 03 20:21:03.995687 master-0 kubenswrapper[29252]: I1203 20:21:03.995221 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-fzh89" Dec 03 20:21:05.500747 master-0 kubenswrapper[29252]: W1203 20:21:05.500679 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb6f03f1_adfa_4249_b822_7dd4acf245be.slice/crio-f760366793475c465d5d9065e096b26c6d54c420a0f7f56dfb18e579892ee102 WatchSource:0}: Error finding container f760366793475c465d5d9065e096b26c6d54c420a0f7f56dfb18e579892ee102: Status 404 returned error can't find the container with id f760366793475c465d5d9065e096b26c6d54c420a0f7f56dfb18e579892ee102 Dec 03 20:21:05.816953 master-0 kubenswrapper[29252]: I1203 20:21:05.816764 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww" event={"ID":"cb6f03f1-adfa-4249-b822-7dd4acf245be","Type":"ContainerStarted","Data":"f760366793475c465d5d9065e096b26c6d54c420a0f7f56dfb18e579892ee102"} Dec 03 20:21:07.020562 master-0 kubenswrapper[29252]: I1203 20:21:07.020504 29252 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-8dbtt"] Dec 03 20:21:07.021429 master-0 kubenswrapper[29252]: I1203 20:21:07.021407 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-8dbtt" Dec 03 20:21:07.024097 master-0 kubenswrapper[29252]: I1203 20:21:07.024047 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 03 20:21:07.024200 master-0 kubenswrapper[29252]: I1203 20:21:07.024148 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 03 20:21:07.038554 master-0 kubenswrapper[29252]: I1203 20:21:07.038503 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-8dbtt"] Dec 03 20:21:07.118228 master-0 kubenswrapper[29252]: I1203 20:21:07.114346 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgqcv\" (UniqueName: \"kubernetes.io/projected/907128fd-4fcf-46cc-b294-19424448ccc9-kube-api-access-qgqcv\") pod \"obo-prometheus-operator-668cf9dfbb-8dbtt\" (UID: \"907128fd-4fcf-46cc-b294-19424448ccc9\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-8dbtt" Dec 03 20:21:07.158047 master-0 kubenswrapper[29252]: I1203 20:21:07.157998 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-54275"] Dec 03 20:21:07.159455 master-0 kubenswrapper[29252]: I1203 20:21:07.159421 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-54275" Dec 03 20:21:07.163803 master-0 kubenswrapper[29252]: I1203 20:21:07.162020 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 03 20:21:07.169723 master-0 kubenswrapper[29252]: I1203 20:21:07.169675 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-22wkc"] Dec 03 20:21:07.171503 master-0 kubenswrapper[29252]: I1203 20:21:07.170967 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-22wkc" Dec 03 20:21:07.186956 master-0 kubenswrapper[29252]: I1203 20:21:07.178851 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-54275"] Dec 03 20:21:07.198800 master-0 kubenswrapper[29252]: I1203 20:21:07.198668 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-22wkc"] Dec 03 20:21:07.225804 master-0 kubenswrapper[29252]: I1203 20:21:07.219973 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgqcv\" (UniqueName: \"kubernetes.io/projected/907128fd-4fcf-46cc-b294-19424448ccc9-kube-api-access-qgqcv\") pod \"obo-prometheus-operator-668cf9dfbb-8dbtt\" (UID: \"907128fd-4fcf-46cc-b294-19424448ccc9\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-8dbtt" Dec 03 20:21:07.253714 master-0 kubenswrapper[29252]: I1203 20:21:07.253654 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgqcv\" (UniqueName: \"kubernetes.io/projected/907128fd-4fcf-46cc-b294-19424448ccc9-kube-api-access-qgqcv\") pod \"obo-prometheus-operator-668cf9dfbb-8dbtt\" 
(UID: \"907128fd-4fcf-46cc-b294-19424448ccc9\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-8dbtt" Dec 03 20:21:07.323929 master-0 kubenswrapper[29252]: I1203 20:21:07.322652 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/608d3e20-ad16-4dbd-a829-5bfe8e6f345c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7f66756b69-22wkc\" (UID: \"608d3e20-ad16-4dbd-a829-5bfe8e6f345c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-22wkc" Dec 03 20:21:07.323929 master-0 kubenswrapper[29252]: I1203 20:21:07.322735 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/608d3e20-ad16-4dbd-a829-5bfe8e6f345c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7f66756b69-22wkc\" (UID: \"608d3e20-ad16-4dbd-a829-5bfe8e6f345c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-22wkc" Dec 03 20:21:07.323929 master-0 kubenswrapper[29252]: I1203 20:21:07.322791 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/48bd42d2-ab48-4b98-86ce-948b7b70b781-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7f66756b69-54275\" (UID: \"48bd42d2-ab48-4b98-86ce-948b7b70b781\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-54275" Dec 03 20:21:07.323929 master-0 kubenswrapper[29252]: I1203 20:21:07.322816 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/48bd42d2-ab48-4b98-86ce-948b7b70b781-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7f66756b69-54275\" (UID: \"48bd42d2-ab48-4b98-86ce-948b7b70b781\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-54275" Dec 03 20:21:07.351880 master-0 kubenswrapper[29252]: I1203 20:21:07.351821 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-8dbtt" Dec 03 20:21:07.391660 master-0 kubenswrapper[29252]: I1203 20:21:07.388048 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-rnk7l"] Dec 03 20:21:07.391660 master-0 kubenswrapper[29252]: I1203 20:21:07.389209 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-rnk7l" Dec 03 20:21:07.398638 master-0 kubenswrapper[29252]: I1203 20:21:07.398584 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-rnk7l"] Dec 03 20:21:07.413025 master-0 kubenswrapper[29252]: I1203 20:21:07.402284 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 03 20:21:07.426530 master-0 kubenswrapper[29252]: I1203 20:21:07.425829 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/608d3e20-ad16-4dbd-a829-5bfe8e6f345c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7f66756b69-22wkc\" (UID: \"608d3e20-ad16-4dbd-a829-5bfe8e6f345c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-22wkc" Dec 03 20:21:07.426530 master-0 kubenswrapper[29252]: I1203 20:21:07.425888 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/608d3e20-ad16-4dbd-a829-5bfe8e6f345c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7f66756b69-22wkc\" (UID: \"608d3e20-ad16-4dbd-a829-5bfe8e6f345c\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-22wkc" Dec 03 20:21:07.426530 master-0 kubenswrapper[29252]: I1203 20:21:07.425915 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/48bd42d2-ab48-4b98-86ce-948b7b70b781-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7f66756b69-54275\" (UID: \"48bd42d2-ab48-4b98-86ce-948b7b70b781\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-54275" Dec 03 20:21:07.426530 master-0 kubenswrapper[29252]: I1203 20:21:07.425936 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/48bd42d2-ab48-4b98-86ce-948b7b70b781-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7f66756b69-54275\" (UID: \"48bd42d2-ab48-4b98-86ce-948b7b70b781\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-54275" Dec 03 20:21:07.429392 master-0 kubenswrapper[29252]: I1203 20:21:07.429316 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/48bd42d2-ab48-4b98-86ce-948b7b70b781-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7f66756b69-54275\" (UID: \"48bd42d2-ab48-4b98-86ce-948b7b70b781\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-54275" Dec 03 20:21:07.430312 master-0 kubenswrapper[29252]: I1203 20:21:07.430268 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/608d3e20-ad16-4dbd-a829-5bfe8e6f345c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7f66756b69-22wkc\" (UID: \"608d3e20-ad16-4dbd-a829-5bfe8e6f345c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-22wkc" Dec 03 20:21:07.443862 master-0 kubenswrapper[29252]: 
I1203 20:21:07.432000 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/48bd42d2-ab48-4b98-86ce-948b7b70b781-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7f66756b69-54275\" (UID: \"48bd42d2-ab48-4b98-86ce-948b7b70b781\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-54275" Dec 03 20:21:07.444917 master-0 kubenswrapper[29252]: I1203 20:21:07.444844 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/608d3e20-ad16-4dbd-a829-5bfe8e6f345c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7f66756b69-22wkc\" (UID: \"608d3e20-ad16-4dbd-a829-5bfe8e6f345c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-22wkc" Dec 03 20:21:07.500849 master-0 kubenswrapper[29252]: I1203 20:21:07.500291 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-54275" Dec 03 20:21:07.527283 master-0 kubenswrapper[29252]: I1203 20:21:07.527218 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-22wkc" Dec 03 20:21:07.527927 master-0 kubenswrapper[29252]: I1203 20:21:07.527903 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwqwm\" (UniqueName: \"kubernetes.io/projected/ad1d4a4f-7aab-4033-b132-5f30d7c5b76a-kube-api-access-bwqwm\") pod \"observability-operator-d8bb48f5d-rnk7l\" (UID: \"ad1d4a4f-7aab-4033-b132-5f30d7c5b76a\") " pod="openshift-operators/observability-operator-d8bb48f5d-rnk7l" Dec 03 20:21:07.528285 master-0 kubenswrapper[29252]: I1203 20:21:07.528162 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad1d4a4f-7aab-4033-b132-5f30d7c5b76a-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-rnk7l\" (UID: \"ad1d4a4f-7aab-4033-b132-5f30d7c5b76a\") " pod="openshift-operators/observability-operator-d8bb48f5d-rnk7l" Dec 03 20:21:07.534383 master-0 kubenswrapper[29252]: I1203 20:21:07.533202 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-glb28"] Dec 03 20:21:07.534383 master-0 kubenswrapper[29252]: I1203 20:21:07.534111 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-glb28" Dec 03 20:21:07.559479 master-0 kubenswrapper[29252]: I1203 20:21:07.559323 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-glb28"] Dec 03 20:21:07.633246 master-0 kubenswrapper[29252]: I1203 20:21:07.633164 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad1d4a4f-7aab-4033-b132-5f30d7c5b76a-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-rnk7l\" (UID: \"ad1d4a4f-7aab-4033-b132-5f30d7c5b76a\") " pod="openshift-operators/observability-operator-d8bb48f5d-rnk7l" Dec 03 20:21:07.633474 master-0 kubenswrapper[29252]: I1203 20:21:07.633272 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2e412375-a8c7-4167-adc2-7e8054c3bf4a-openshift-service-ca\") pod \"perses-operator-5446b9c989-glb28\" (UID: \"2e412375-a8c7-4167-adc2-7e8054c3bf4a\") " pod="openshift-operators/perses-operator-5446b9c989-glb28" Dec 03 20:21:07.633474 master-0 kubenswrapper[29252]: I1203 20:21:07.633315 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwqwm\" (UniqueName: \"kubernetes.io/projected/ad1d4a4f-7aab-4033-b132-5f30d7c5b76a-kube-api-access-bwqwm\") pod \"observability-operator-d8bb48f5d-rnk7l\" (UID: \"ad1d4a4f-7aab-4033-b132-5f30d7c5b76a\") " pod="openshift-operators/observability-operator-d8bb48f5d-rnk7l" Dec 03 20:21:07.633474 master-0 kubenswrapper[29252]: I1203 20:21:07.633346 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbm9r\" (UniqueName: \"kubernetes.io/projected/2e412375-a8c7-4167-adc2-7e8054c3bf4a-kube-api-access-zbm9r\") pod \"perses-operator-5446b9c989-glb28\" (UID: 
\"2e412375-a8c7-4167-adc2-7e8054c3bf4a\") " pod="openshift-operators/perses-operator-5446b9c989-glb28" Dec 03 20:21:07.659425 master-0 kubenswrapper[29252]: I1203 20:21:07.659383 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad1d4a4f-7aab-4033-b132-5f30d7c5b76a-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-rnk7l\" (UID: \"ad1d4a4f-7aab-4033-b132-5f30d7c5b76a\") " pod="openshift-operators/observability-operator-d8bb48f5d-rnk7l" Dec 03 20:21:07.679680 master-0 kubenswrapper[29252]: I1203 20:21:07.679626 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwqwm\" (UniqueName: \"kubernetes.io/projected/ad1d4a4f-7aab-4033-b132-5f30d7c5b76a-kube-api-access-bwqwm\") pod \"observability-operator-d8bb48f5d-rnk7l\" (UID: \"ad1d4a4f-7aab-4033-b132-5f30d7c5b76a\") " pod="openshift-operators/observability-operator-d8bb48f5d-rnk7l" Dec 03 20:21:07.737827 master-0 kubenswrapper[29252]: I1203 20:21:07.737754 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2e412375-a8c7-4167-adc2-7e8054c3bf4a-openshift-service-ca\") pod \"perses-operator-5446b9c989-glb28\" (UID: \"2e412375-a8c7-4167-adc2-7e8054c3bf4a\") " pod="openshift-operators/perses-operator-5446b9c989-glb28" Dec 03 20:21:07.738120 master-0 kubenswrapper[29252]: I1203 20:21:07.738101 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbm9r\" (UniqueName: \"kubernetes.io/projected/2e412375-a8c7-4167-adc2-7e8054c3bf4a-kube-api-access-zbm9r\") pod \"perses-operator-5446b9c989-glb28\" (UID: \"2e412375-a8c7-4167-adc2-7e8054c3bf4a\") " pod="openshift-operators/perses-operator-5446b9c989-glb28" Dec 03 20:21:07.739401 master-0 kubenswrapper[29252]: I1203 20:21:07.739380 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2e412375-a8c7-4167-adc2-7e8054c3bf4a-openshift-service-ca\") pod \"perses-operator-5446b9c989-glb28\" (UID: \"2e412375-a8c7-4167-adc2-7e8054c3bf4a\") " pod="openshift-operators/perses-operator-5446b9c989-glb28" Dec 03 20:21:07.767250 master-0 kubenswrapper[29252]: I1203 20:21:07.765612 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbm9r\" (UniqueName: \"kubernetes.io/projected/2e412375-a8c7-4167-adc2-7e8054c3bf4a-kube-api-access-zbm9r\") pod \"perses-operator-5446b9c989-glb28\" (UID: \"2e412375-a8c7-4167-adc2-7e8054c3bf4a\") " pod="openshift-operators/perses-operator-5446b9c989-glb28" Dec 03 20:21:07.787479 master-0 kubenswrapper[29252]: I1203 20:21:07.787123 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-rnk7l" Dec 03 20:21:07.858161 master-0 kubenswrapper[29252]: I1203 20:21:07.858093 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-glb28" Dec 03 20:21:11.153840 master-0 kubenswrapper[29252]: I1203 20:21:11.153766 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-fdb6t"] Dec 03 20:21:11.167505 master-0 kubenswrapper[29252]: W1203 20:21:11.167437 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf07a0e83_7142_455c_bbbd_5b7b10b03bc0.slice/crio-6d6ac4ec24668d615cc5555ae50ebece089b28e70faa4feeca784ca637063bb0 WatchSource:0}: Error finding container 6d6ac4ec24668d615cc5555ae50ebece089b28e70faa4feeca784ca637063bb0: Status 404 returned error can't find the container with id 6d6ac4ec24668d615cc5555ae50ebece089b28e70faa4feeca784ca637063bb0 Dec 03 20:21:11.479149 master-0 kubenswrapper[29252]: I1203 20:21:11.479037 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-glb28"] Dec 03 20:21:11.506498 master-0 kubenswrapper[29252]: I1203 20:21:11.506446 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-rnk7l"] Dec 03 20:21:11.543496 master-0 kubenswrapper[29252]: I1203 20:21:11.540862 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-8dbtt"] Dec 03 20:21:11.547519 master-0 kubenswrapper[29252]: I1203 20:21:11.547382 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv"] Dec 03 20:21:11.563429 master-0 kubenswrapper[29252]: I1203 20:21:11.563397 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-54275"] Dec 03 20:21:11.576085 master-0 kubenswrapper[29252]: I1203 20:21:11.575991 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-22wkc"] Dec 03 20:21:11.654855 master-0 kubenswrapper[29252]: I1203 20:21:11.654809 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-fzh89"] Dec 03 20:21:11.661036 master-0 kubenswrapper[29252]: W1203 20:21:11.660727 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78275772_3b78_4283_8b15_f28695b4a15f.slice/crio-176fc1f2d860e2c199d4e722c43f68133469955f183c489595637a055411c5f2 WatchSource:0}: Error finding container 176fc1f2d860e2c199d4e722c43f68133469955f183c489595637a055411c5f2: Status 404 returned error can't find the container with id 176fc1f2d860e2c199d4e722c43f68133469955f183c489595637a055411c5f2 Dec 03 20:21:11.887611 master-0 kubenswrapper[29252]: I1203 20:21:11.887534 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-fdb6t" event={"ID":"f07a0e83-7142-455c-bbbd-5b7b10b03bc0","Type":"ContainerStarted","Data":"7b60bf230dab808efe55bd2c6d68e2c982e706320dc73bd3066034d23031d3d4"} Dec 03 20:21:11.887611 master-0 kubenswrapper[29252]: I1203 20:21:11.887599 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-fdb6t" event={"ID":"f07a0e83-7142-455c-bbbd-5b7b10b03bc0","Type":"ContainerStarted","Data":"6d6ac4ec24668d615cc5555ae50ebece089b28e70faa4feeca784ca637063bb0"} Dec 03 20:21:11.890013 master-0 kubenswrapper[29252]: I1203 20:21:11.889940 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-22wkc" event={"ID":"608d3e20-ad16-4dbd-a829-5bfe8e6f345c","Type":"ContainerStarted","Data":"f841fca50a065388922510b9c6ea94a5507d3dbc301fda470abafa5827660a72"} Dec 03 20:21:11.892536 master-0 kubenswrapper[29252]: I1203 20:21:11.892504 29252 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-l9xts" event={"ID":"bc12a15e-d84d-430c-a33c-833407ab976d","Type":"ContainerStarted","Data":"8b49631833272e7168006477f5870ed2a65347025edb48c38a4edeb60c277f8b"} Dec 03 20:21:11.894828 master-0 kubenswrapper[29252]: I1203 20:21:11.894799 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv" event={"ID":"5a2dc21d-ada3-4739-9e62-cbdba8e4985a","Type":"ContainerStarted","Data":"12b465d546b091bc939b5097c966b71e4cea83e958240c32397963480d90138b"} Dec 03 20:21:11.895908 master-0 kubenswrapper[29252]: I1203 20:21:11.895878 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-rnk7l" event={"ID":"ad1d4a4f-7aab-4033-b132-5f30d7c5b76a","Type":"ContainerStarted","Data":"8dbc04b21d0c2c2d9aa2d8ee4518037ed36e413b8697c897cc037e8b6fdd369d"} Dec 03 20:21:11.897288 master-0 kubenswrapper[29252]: I1203 20:21:11.897252 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-glb28" event={"ID":"2e412375-a8c7-4167-adc2-7e8054c3bf4a","Type":"ContainerStarted","Data":"632cf023d52baa6ffeb2b61b5209f299eb78aaf3504a5e184f2ca0d1cd76a504"} Dec 03 20:21:11.900945 master-0 kubenswrapper[29252]: I1203 20:21:11.900919 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-fzh89" event={"ID":"78275772-3b78-4283-8b15-f28695b4a15f","Type":"ContainerStarted","Data":"92240e3acc40145d55d9d8784aa78875c9a0c63e811092afae9b4724d34da8d5"} Dec 03 20:21:11.900945 master-0 kubenswrapper[29252]: I1203 20:21:11.900942 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-fzh89" event={"ID":"78275772-3b78-4283-8b15-f28695b4a15f","Type":"ContainerStarted","Data":"176fc1f2d860e2c199d4e722c43f68133469955f183c489595637a055411c5f2"} Dec 03 20:21:11.911017 master-0 kubenswrapper[29252]: I1203 
20:21:11.910949 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-8dbtt" event={"ID":"907128fd-4fcf-46cc-b294-19424448ccc9","Type":"ContainerStarted","Data":"1f0a976a4a9945cc181ab953395d2159206a6ce9c1b7d569575d341e149b698b"} Dec 03 20:21:11.922024 master-0 kubenswrapper[29252]: I1203 20:21:11.920982 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-dccxw" event={"ID":"ad5369d7-39be-4259-a3b3-c67744ea990a","Type":"ContainerStarted","Data":"d90a7c5ce529332506c48d7ac1565f91c73b7657717ae58ac913548d63ea504c"} Dec 03 20:21:11.922024 master-0 kubenswrapper[29252]: I1203 20:21:11.921160 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-dccxw" Dec 03 20:21:11.923703 master-0 kubenswrapper[29252]: I1203 20:21:11.923645 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-54275" event={"ID":"48bd42d2-ab48-4b98-86ce-948b7b70b781","Type":"ContainerStarted","Data":"f956bacca619554e9cb71acba19195733f5c3e4ac5636278b84d9f238161bdda"} Dec 03 20:21:11.950092 master-0 kubenswrapper[29252]: I1203 20:21:11.950001 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-fdb6t" podStartSLOduration=11.949956738000001 podStartE2EDuration="11.949956738s" podCreationTimestamp="2025-12-03 20:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:21:11.90704343 +0000 UTC m=+706.720588403" watchObservedRunningTime="2025-12-03 20:21:11.949956738 +0000 UTC m=+706.763501691" Dec 03 20:21:11.955211 master-0 kubenswrapper[29252]: I1203 20:21:11.954758 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-l9xts" podStartSLOduration=2.852252804 podStartE2EDuration="16.954740135s" podCreationTimestamp="2025-12-03 20:20:55 +0000 UTC" firstStartedPulling="2025-12-03 20:20:56.49898155 +0000 UTC m=+691.312526513" lastFinishedPulling="2025-12-03 20:21:10.601468891 +0000 UTC m=+705.415013844" observedRunningTime="2025-12-03 20:21:11.951396883 +0000 UTC m=+706.764941846" watchObservedRunningTime="2025-12-03 20:21:11.954740135 +0000 UTC m=+706.768285088" Dec 03 20:21:11.981536 master-0 kubenswrapper[29252]: I1203 20:21:11.979289 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-fzh89" podStartSLOduration=8.979268154 podStartE2EDuration="8.979268154s" podCreationTimestamp="2025-12-03 20:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:21:11.978760272 +0000 UTC m=+706.792305235" watchObservedRunningTime="2025-12-03 20:21:11.979268154 +0000 UTC m=+706.792813117" Dec 03 20:21:12.022954 master-0 kubenswrapper[29252]: I1203 20:21:12.022166 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-dccxw" podStartSLOduration=3.088761116 podStartE2EDuration="19.02208992s" podCreationTimestamp="2025-12-03 20:20:53 +0000 UTC" firstStartedPulling="2025-12-03 20:20:54.659523887 +0000 UTC m=+689.473068850" lastFinishedPulling="2025-12-03 20:21:10.592852701 +0000 UTC m=+705.406397654" observedRunningTime="2025-12-03 20:21:12.013008157 +0000 UTC m=+706.826553120" watchObservedRunningTime="2025-12-03 20:21:12.02208992 +0000 UTC m=+706.835634893" Dec 03 20:21:13.943476 master-0 kubenswrapper[29252]: I1203 20:21:13.943401 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww" 
event={"ID":"cb6f03f1-adfa-4249-b822-7dd4acf245be","Type":"ContainerStarted","Data":"fe1691d51fcc1c992f372924dd207e5f761178cbd151dc208e59f9d0debbad63"} Dec 03 20:21:13.944210 master-0 kubenswrapper[29252]: I1203 20:21:13.943722 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww" Dec 03 20:21:13.988441 master-0 kubenswrapper[29252]: I1203 20:21:13.988321 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww" podStartSLOduration=9.317921767 podStartE2EDuration="16.988298536s" podCreationTimestamp="2025-12-03 20:20:57 +0000 UTC" firstStartedPulling="2025-12-03 20:21:05.510811674 +0000 UTC m=+700.324356627" lastFinishedPulling="2025-12-03 20:21:13.181188443 +0000 UTC m=+707.994733396" observedRunningTime="2025-12-03 20:21:13.965732575 +0000 UTC m=+708.779277528" watchObservedRunningTime="2025-12-03 20:21:13.988298536 +0000 UTC m=+708.801843489" Dec 03 20:21:19.210225 master-0 kubenswrapper[29252]: I1203 20:21:19.210132 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-dccxw" Dec 03 20:21:20.013788 master-0 kubenswrapper[29252]: I1203 20:21:20.013727 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-rnk7l" event={"ID":"ad1d4a4f-7aab-4033-b132-5f30d7c5b76a","Type":"ContainerStarted","Data":"9e3362b589d979af1a214a532956a5903bb798c03270806e76973624e414e211"} Dec 03 20:21:20.014261 master-0 kubenswrapper[29252]: I1203 20:21:20.014223 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-rnk7l" Dec 03 20:21:20.015930 master-0 kubenswrapper[29252]: I1203 20:21:20.015873 29252 patch_prober.go:28] interesting pod/observability-operator-d8bb48f5d-rnk7l container/operator 
namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.128.0.136:8081/healthz\": dial tcp 10.128.0.136:8081: connect: connection refused" start-of-body= Dec 03 20:21:20.015930 master-0 kubenswrapper[29252]: I1203 20:21:20.015919 29252 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-d8bb48f5d-rnk7l" podUID="ad1d4a4f-7aab-4033-b132-5f30d7c5b76a" containerName="operator" probeResult="failure" output="Get \"http://10.128.0.136:8081/healthz\": dial tcp 10.128.0.136:8081: connect: connection refused" Dec 03 20:21:20.050620 master-0 kubenswrapper[29252]: I1203 20:21:20.050455 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-rnk7l" podStartSLOduration=5.048448704 podStartE2EDuration="13.050428777s" podCreationTimestamp="2025-12-03 20:21:07 +0000 UTC" firstStartedPulling="2025-12-03 20:21:11.564558991 +0000 UTC m=+706.378103944" lastFinishedPulling="2025-12-03 20:21:19.566539064 +0000 UTC m=+714.380084017" observedRunningTime="2025-12-03 20:21:20.046706966 +0000 UTC m=+714.860251939" watchObservedRunningTime="2025-12-03 20:21:20.050428777 +0000 UTC m=+714.863973740" Dec 03 20:21:21.022463 master-0 kubenswrapper[29252]: I1203 20:21:21.022369 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-8dbtt" event={"ID":"907128fd-4fcf-46cc-b294-19424448ccc9","Type":"ContainerStarted","Data":"10319e185e6491572bd9cfa349ece1857ac687ea055257104013e8034679f7fa"} Dec 03 20:21:21.024183 master-0 kubenswrapper[29252]: I1203 20:21:21.024129 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-22wkc" event={"ID":"608d3e20-ad16-4dbd-a829-5bfe8e6f345c","Type":"ContainerStarted","Data":"7c0f295976f73edc2d5e1b2f101ec274f045994ea5d8202f79d387d15b1011fb"} Dec 03 20:21:21.026314 master-0 
kubenswrapper[29252]: I1203 20:21:21.026244 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-54275" event={"ID":"48bd42d2-ab48-4b98-86ce-948b7b70b781","Type":"ContainerStarted","Data":"48eaf21df77e0849d64a605cfeec75809b3c0fc7eeafd7ebfcdbc450e1bb6cac"} Dec 03 20:21:21.029399 master-0 kubenswrapper[29252]: I1203 20:21:21.029127 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv" event={"ID":"5a2dc21d-ada3-4739-9e62-cbdba8e4985a","Type":"ContainerStarted","Data":"4e09a0bf3ba1ccc45b5f21d22f13d5bb356f2d18ff527d3a0c490907150fc59b"} Dec 03 20:21:21.029560 master-0 kubenswrapper[29252]: I1203 20:21:21.029405 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv" Dec 03 20:21:21.031186 master-0 kubenswrapper[29252]: I1203 20:21:21.031121 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-glb28" event={"ID":"2e412375-a8c7-4167-adc2-7e8054c3bf4a","Type":"ContainerStarted","Data":"5a7ecd71ccab60177202a014ac5f1525766a65a0a637abbfe38590e2f01fa1c5"} Dec 03 20:21:21.035496 master-0 kubenswrapper[29252]: I1203 20:21:21.035447 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-rnk7l" Dec 03 20:21:21.047979 master-0 kubenswrapper[29252]: I1203 20:21:21.047878 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-8dbtt" podStartSLOduration=7.085162759 podStartE2EDuration="15.047853814s" podCreationTimestamp="2025-12-03 20:21:06 +0000 UTC" firstStartedPulling="2025-12-03 20:21:11.554908075 +0000 UTC m=+706.368453018" lastFinishedPulling="2025-12-03 20:21:19.51759911 +0000 UTC m=+714.331144073" observedRunningTime="2025-12-03 
20:21:21.04030896 +0000 UTC m=+715.853853933" watchObservedRunningTime="2025-12-03 20:21:21.047853814 +0000 UTC m=+715.861398777" Dec 03 20:21:21.066281 master-0 kubenswrapper[29252]: I1203 20:21:21.066193 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-22wkc" podStartSLOduration=6.19375279 podStartE2EDuration="14.066176442s" podCreationTimestamp="2025-12-03 20:21:07 +0000 UTC" firstStartedPulling="2025-12-03 20:21:11.564710654 +0000 UTC m=+706.378255607" lastFinishedPulling="2025-12-03 20:21:19.437134306 +0000 UTC m=+714.250679259" observedRunningTime="2025-12-03 20:21:21.061506088 +0000 UTC m=+715.875051071" watchObservedRunningTime="2025-12-03 20:21:21.066176442 +0000 UTC m=+715.879721395" Dec 03 20:21:21.106389 master-0 kubenswrapper[29252]: I1203 20:21:21.106273 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-glb28" podStartSLOduration=6.110413896 podStartE2EDuration="14.10624383s" podCreationTimestamp="2025-12-03 20:21:07 +0000 UTC" firstStartedPulling="2025-12-03 20:21:11.521648043 +0000 UTC m=+706.335192996" lastFinishedPulling="2025-12-03 20:21:19.517477977 +0000 UTC m=+714.331022930" observedRunningTime="2025-12-03 20:21:21.098360457 +0000 UTC m=+715.911905430" watchObservedRunningTime="2025-12-03 20:21:21.10624383 +0000 UTC m=+715.919788803" Dec 03 20:21:21.148867 master-0 kubenswrapper[29252]: I1203 20:21:21.148759 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f66756b69-54275" podStartSLOduration=6.276120381 podStartE2EDuration="14.148737147s" podCreationTimestamp="2025-12-03 20:21:07 +0000 UTC" firstStartedPulling="2025-12-03 20:21:11.5645195 +0000 UTC m=+706.378064453" lastFinishedPulling="2025-12-03 20:21:19.437136266 +0000 UTC m=+714.250681219" observedRunningTime="2025-12-03 
20:21:21.127452588 +0000 UTC m=+715.940997551" watchObservedRunningTime="2025-12-03 20:21:21.148737147 +0000 UTC m=+715.962282100" Dec 03 20:21:21.162417 master-0 kubenswrapper[29252]: I1203 20:21:21.161699 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv" podStartSLOduration=15.145518063 podStartE2EDuration="23.161679033s" podCreationTimestamp="2025-12-03 20:20:58 +0000 UTC" firstStartedPulling="2025-12-03 20:21:11.558359099 +0000 UTC m=+706.371904052" lastFinishedPulling="2025-12-03 20:21:19.574520069 +0000 UTC m=+714.388065022" observedRunningTime="2025-12-03 20:21:21.160960365 +0000 UTC m=+715.974505328" watchObservedRunningTime="2025-12-03 20:21:21.161679033 +0000 UTC m=+715.975224006" Dec 03 20:21:22.042121 master-0 kubenswrapper[29252]: I1203 20:21:22.042017 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-glb28" Dec 03 20:21:27.861316 master-0 kubenswrapper[29252]: I1203 20:21:27.861251 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-glb28" Dec 03 20:21:27.954588 master-0 kubenswrapper[29252]: I1203 20:21:27.954492 29252 scope.go:117] "RemoveContainer" containerID="1451a79631eaf16c9eb478f51661577aa37eaea15ed18eb83a425743c7c87e7e" Dec 03 20:21:27.988100 master-0 kubenswrapper[29252]: I1203 20:21:27.988032 29252 scope.go:117] "RemoveContainer" containerID="34ce07503848cd6aad62ba91f3c407cfb5a322733a238fe0815d01f74c614873" Dec 03 20:21:28.010652 master-0 kubenswrapper[29252]: I1203 20:21:28.010572 29252 scope.go:117] "RemoveContainer" containerID="0976a89f464ebb972c93a46088e9eeb54bd3bcf4771fafb2ab2a84f679391cfb" Dec 03 20:21:28.064673 master-0 kubenswrapper[29252]: I1203 20:21:28.063249 29252 scope.go:117] "RemoveContainer" containerID="47a401e5c33c18bb1cfb970151b713d5420adf0b84eb4a88be63ba450bd5a61b" Dec 03 
20:21:28.089215 master-0 kubenswrapper[29252]: I1203 20:21:28.089110 29252 scope.go:117] "RemoveContainer" containerID="a47d8d5fc9fb8d8f5c161e8c2f4a0a8e14e1a13017007439234149dd1f6a68f4" Dec 03 20:21:28.114465 master-0 kubenswrapper[29252]: I1203 20:21:28.114372 29252 scope.go:117] "RemoveContainer" containerID="41edc0c3867479b941ec31180dac8bca736f22ef5242e5d1acff2ee882afe88a" Dec 03 20:21:28.145149 master-0 kubenswrapper[29252]: I1203 20:21:28.145082 29252 scope.go:117] "RemoveContainer" containerID="2db45bdca5ac382650099e41ec380f87182a50ddf6fab9295a34b23e8201c999" Dec 03 20:21:38.674665 master-0 kubenswrapper[29252]: I1203 20:21:38.674585 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-bfbbd6984-sx9nv" Dec 03 20:21:48.368647 master-0 kubenswrapper[29252]: I1203 20:21:48.368548 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7d9499bd6f-4h4ww" Dec 03 20:21:53.952378 master-0 kubenswrapper[29252]: I1203 20:21:53.952318 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-58pgc"] Dec 03 20:21:53.953643 master-0 kubenswrapper[29252]: I1203 20:21:53.953593 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-58pgc" Dec 03 20:21:53.956502 master-0 kubenswrapper[29252]: I1203 20:21:53.956456 29252 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 03 20:21:53.967174 master-0 kubenswrapper[29252]: I1203 20:21:53.967094 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-c7bbm"] Dec 03 20:21:53.971096 master-0 kubenswrapper[29252]: I1203 20:21:53.970997 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:53.974522 master-0 kubenswrapper[29252]: I1203 20:21:53.973927 29252 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 03 20:21:53.976916 master-0 kubenswrapper[29252]: I1203 20:21:53.976860 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 03 20:21:53.978562 master-0 kubenswrapper[29252]: I1203 20:21:53.978527 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-58pgc"] Dec 03 20:21:54.072774 master-0 kubenswrapper[29252]: I1203 20:21:54.070594 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-j26tj"] Dec 03 20:21:54.072774 master-0 kubenswrapper[29252]: I1203 20:21:54.072133 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-j26tj" Dec 03 20:21:54.078450 master-0 kubenswrapper[29252]: I1203 20:21:54.078367 29252 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 03 20:21:54.085987 master-0 kubenswrapper[29252]: I1203 20:21:54.085912 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 03 20:21:54.086208 master-0 kubenswrapper[29252]: I1203 20:21:54.086053 29252 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 03 20:21:54.086717 master-0 kubenswrapper[29252]: I1203 20:21:54.086659 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-96nlf"] Dec 03 20:21:54.088241 master-0 kubenswrapper[29252]: I1203 20:21:54.088218 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-96nlf" Dec 03 20:21:54.106806 master-0 kubenswrapper[29252]: I1203 20:21:54.106724 29252 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 03 20:21:54.107185 master-0 kubenswrapper[29252]: I1203 20:21:54.107108 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bebbe36-46ba-47e9-b53e-2c83abe9c329-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-58pgc\" (UID: \"2bebbe36-46ba-47e9-b53e-2c83abe9c329\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-58pgc" Dec 03 20:21:54.107260 master-0 kubenswrapper[29252]: I1203 20:21:54.107196 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4977c492-5b52-447d-ab42-4a70601a0da4-frr-conf\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.107260 master-0 kubenswrapper[29252]: I1203 20:21:54.107220 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4977c492-5b52-447d-ab42-4a70601a0da4-metrics\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.107260 master-0 kubenswrapper[29252]: I1203 20:21:54.107253 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4977c492-5b52-447d-ab42-4a70601a0da4-reloader\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.107423 master-0 kubenswrapper[29252]: I1203 20:21:54.107292 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-b9r8p\" (UniqueName: \"kubernetes.io/projected/2bebbe36-46ba-47e9-b53e-2c83abe9c329-kube-api-access-b9r8p\") pod \"frr-k8s-webhook-server-7fcb986d4-58pgc\" (UID: \"2bebbe36-46ba-47e9-b53e-2c83abe9c329\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-58pgc" Dec 03 20:21:54.107423 master-0 kubenswrapper[29252]: I1203 20:21:54.107311 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4977c492-5b52-447d-ab42-4a70601a0da4-metrics-certs\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.107423 master-0 kubenswrapper[29252]: I1203 20:21:54.107400 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4977c492-5b52-447d-ab42-4a70601a0da4-frr-startup\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.107556 master-0 kubenswrapper[29252]: I1203 20:21:54.107436 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9x4m\" (UniqueName: \"kubernetes.io/projected/4977c492-5b52-447d-ab42-4a70601a0da4-kube-api-access-w9x4m\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.107556 master-0 kubenswrapper[29252]: I1203 20:21:54.107459 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4977c492-5b52-447d-ab42-4a70601a0da4-frr-sockets\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.126579 master-0 kubenswrapper[29252]: I1203 20:21:54.126463 29252 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-96nlf"] Dec 03 20:21:54.208694 master-0 kubenswrapper[29252]: I1203 20:21:54.208554 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/52511992-a397-485f-b709-f81257ee8e16-metallb-excludel2\") pod \"speaker-j26tj\" (UID: \"52511992-a397-485f-b709-f81257ee8e16\") " pod="metallb-system/speaker-j26tj" Dec 03 20:21:54.208694 master-0 kubenswrapper[29252]: I1203 20:21:54.208627 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4977c492-5b52-447d-ab42-4a70601a0da4-frr-startup\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.209106 master-0 kubenswrapper[29252]: I1203 20:21:54.208697 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxj2m\" (UniqueName: \"kubernetes.io/projected/1ae49184-91db-4355-b553-8cc5506e80bc-kube-api-access-pxj2m\") pod \"controller-f8648f98b-96nlf\" (UID: \"1ae49184-91db-4355-b553-8cc5506e80bc\") " pod="metallb-system/controller-f8648f98b-96nlf" Dec 03 20:21:54.209106 master-0 kubenswrapper[29252]: I1203 20:21:54.208737 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9x4m\" (UniqueName: \"kubernetes.io/projected/4977c492-5b52-447d-ab42-4a70601a0da4-kube-api-access-w9x4m\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.209106 master-0 kubenswrapper[29252]: I1203 20:21:54.208801 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4977c492-5b52-447d-ab42-4a70601a0da4-frr-sockets\") pod \"frr-k8s-c7bbm\" (UID: 
\"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.209106 master-0 kubenswrapper[29252]: I1203 20:21:54.208856 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bebbe36-46ba-47e9-b53e-2c83abe9c329-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-58pgc\" (UID: \"2bebbe36-46ba-47e9-b53e-2c83abe9c329\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-58pgc" Dec 03 20:21:54.209106 master-0 kubenswrapper[29252]: I1203 20:21:54.208951 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4977c492-5b52-447d-ab42-4a70601a0da4-frr-conf\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.209106 master-0 kubenswrapper[29252]: I1203 20:21:54.208980 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4977c492-5b52-447d-ab42-4a70601a0da4-metrics\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.209106 master-0 kubenswrapper[29252]: I1203 20:21:54.209010 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52511992-a397-485f-b709-f81257ee8e16-metrics-certs\") pod \"speaker-j26tj\" (UID: \"52511992-a397-485f-b709-f81257ee8e16\") " pod="metallb-system/speaker-j26tj" Dec 03 20:21:54.209106 master-0 kubenswrapper[29252]: I1203 20:21:54.209040 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4977c492-5b52-447d-ab42-4a70601a0da4-reloader\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.209106 
master-0 kubenswrapper[29252]: I1203 20:21:54.209064 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr4wl\" (UniqueName: \"kubernetes.io/projected/52511992-a397-485f-b709-f81257ee8e16-kube-api-access-tr4wl\") pod \"speaker-j26tj\" (UID: \"52511992-a397-485f-b709-f81257ee8e16\") " pod="metallb-system/speaker-j26tj" Dec 03 20:21:54.209106 master-0 kubenswrapper[29252]: I1203 20:21:54.209093 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9r8p\" (UniqueName: \"kubernetes.io/projected/2bebbe36-46ba-47e9-b53e-2c83abe9c329-kube-api-access-b9r8p\") pod \"frr-k8s-webhook-server-7fcb986d4-58pgc\" (UID: \"2bebbe36-46ba-47e9-b53e-2c83abe9c329\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-58pgc" Dec 03 20:21:54.209569 master-0 kubenswrapper[29252]: I1203 20:21:54.209119 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4977c492-5b52-447d-ab42-4a70601a0da4-metrics-certs\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.209569 master-0 kubenswrapper[29252]: I1203 20:21:54.209152 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/52511992-a397-485f-b709-f81257ee8e16-memberlist\") pod \"speaker-j26tj\" (UID: \"52511992-a397-485f-b709-f81257ee8e16\") " pod="metallb-system/speaker-j26tj" Dec 03 20:21:54.209569 master-0 kubenswrapper[29252]: I1203 20:21:54.209216 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae49184-91db-4355-b553-8cc5506e80bc-metrics-certs\") pod \"controller-f8648f98b-96nlf\" (UID: \"1ae49184-91db-4355-b553-8cc5506e80bc\") " 
pod="metallb-system/controller-f8648f98b-96nlf" Dec 03 20:21:54.209569 master-0 kubenswrapper[29252]: I1203 20:21:54.209260 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ae49184-91db-4355-b553-8cc5506e80bc-cert\") pod \"controller-f8648f98b-96nlf\" (UID: \"1ae49184-91db-4355-b553-8cc5506e80bc\") " pod="metallb-system/controller-f8648f98b-96nlf" Dec 03 20:21:54.209773 master-0 kubenswrapper[29252]: I1203 20:21:54.209598 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4977c492-5b52-447d-ab42-4a70601a0da4-frr-sockets\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.209852 master-0 kubenswrapper[29252]: I1203 20:21:54.209789 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4977c492-5b52-447d-ab42-4a70601a0da4-reloader\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.210010 master-0 kubenswrapper[29252]: I1203 20:21:54.209971 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4977c492-5b52-447d-ab42-4a70601a0da4-frr-conf\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.210092 master-0 kubenswrapper[29252]: I1203 20:21:54.210060 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4977c492-5b52-447d-ab42-4a70601a0da4-frr-startup\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.210350 master-0 kubenswrapper[29252]: I1203 20:21:54.210327 29252 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4977c492-5b52-447d-ab42-4a70601a0da4-metrics\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.217021 master-0 kubenswrapper[29252]: I1203 20:21:54.213190 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bebbe36-46ba-47e9-b53e-2c83abe9c329-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-58pgc\" (UID: \"2bebbe36-46ba-47e9-b53e-2c83abe9c329\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-58pgc" Dec 03 20:21:54.217021 master-0 kubenswrapper[29252]: I1203 20:21:54.216587 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4977c492-5b52-447d-ab42-4a70601a0da4-metrics-certs\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.231931 master-0 kubenswrapper[29252]: I1203 20:21:54.231859 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9r8p\" (UniqueName: \"kubernetes.io/projected/2bebbe36-46ba-47e9-b53e-2c83abe9c329-kube-api-access-b9r8p\") pod \"frr-k8s-webhook-server-7fcb986d4-58pgc\" (UID: \"2bebbe36-46ba-47e9-b53e-2c83abe9c329\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-58pgc" Dec 03 20:21:54.237134 master-0 kubenswrapper[29252]: I1203 20:21:54.237102 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9x4m\" (UniqueName: \"kubernetes.io/projected/4977c492-5b52-447d-ab42-4a70601a0da4-kube-api-access-w9x4m\") pod \"frr-k8s-c7bbm\" (UID: \"4977c492-5b52-447d-ab42-4a70601a0da4\") " pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.310802 master-0 kubenswrapper[29252]: I1203 20:21:54.310563 29252 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae49184-91db-4355-b553-8cc5506e80bc-metrics-certs\") pod \"controller-f8648f98b-96nlf\" (UID: \"1ae49184-91db-4355-b553-8cc5506e80bc\") " pod="metallb-system/controller-f8648f98b-96nlf" Dec 03 20:21:54.310802 master-0 kubenswrapper[29252]: I1203 20:21:54.310627 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ae49184-91db-4355-b553-8cc5506e80bc-cert\") pod \"controller-f8648f98b-96nlf\" (UID: \"1ae49184-91db-4355-b553-8cc5506e80bc\") " pod="metallb-system/controller-f8648f98b-96nlf" Dec 03 20:21:54.310802 master-0 kubenswrapper[29252]: I1203 20:21:54.310656 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/52511992-a397-485f-b709-f81257ee8e16-metallb-excludel2\") pod \"speaker-j26tj\" (UID: \"52511992-a397-485f-b709-f81257ee8e16\") " pod="metallb-system/speaker-j26tj" Dec 03 20:21:54.310802 master-0 kubenswrapper[29252]: I1203 20:21:54.310685 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxj2m\" (UniqueName: \"kubernetes.io/projected/1ae49184-91db-4355-b553-8cc5506e80bc-kube-api-access-pxj2m\") pod \"controller-f8648f98b-96nlf\" (UID: \"1ae49184-91db-4355-b553-8cc5506e80bc\") " pod="metallb-system/controller-f8648f98b-96nlf" Dec 03 20:21:54.310802 master-0 kubenswrapper[29252]: I1203 20:21:54.310731 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52511992-a397-485f-b709-f81257ee8e16-metrics-certs\") pod \"speaker-j26tj\" (UID: \"52511992-a397-485f-b709-f81257ee8e16\") " pod="metallb-system/speaker-j26tj" Dec 03 20:21:54.310802 master-0 kubenswrapper[29252]: I1203 20:21:54.310750 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr4wl\" 
(UniqueName: \"kubernetes.io/projected/52511992-a397-485f-b709-f81257ee8e16-kube-api-access-tr4wl\") pod \"speaker-j26tj\" (UID: \"52511992-a397-485f-b709-f81257ee8e16\") " pod="metallb-system/speaker-j26tj" Dec 03 20:21:54.310802 master-0 kubenswrapper[29252]: I1203 20:21:54.310792 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/52511992-a397-485f-b709-f81257ee8e16-memberlist\") pod \"speaker-j26tj\" (UID: \"52511992-a397-485f-b709-f81257ee8e16\") " pod="metallb-system/speaker-j26tj" Dec 03 20:21:54.311222 master-0 kubenswrapper[29252]: E1203 20:21:54.310915 29252 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 20:21:54.311222 master-0 kubenswrapper[29252]: E1203 20:21:54.310966 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52511992-a397-485f-b709-f81257ee8e16-memberlist podName:52511992-a397-485f-b709-f81257ee8e16 nodeName:}" failed. No retries permitted until 2025-12-03 20:21:54.810948348 +0000 UTC m=+749.624493301 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/52511992-a397-485f-b709-f81257ee8e16-memberlist") pod "speaker-j26tj" (UID: "52511992-a397-485f-b709-f81257ee8e16") : secret "metallb-memberlist" not found Dec 03 20:21:54.312178 master-0 kubenswrapper[29252]: I1203 20:21:54.312120 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/52511992-a397-485f-b709-f81257ee8e16-metallb-excludel2\") pod \"speaker-j26tj\" (UID: \"52511992-a397-485f-b709-f81257ee8e16\") " pod="metallb-system/speaker-j26tj" Dec 03 20:21:54.317793 master-0 kubenswrapper[29252]: I1203 20:21:54.314594 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae49184-91db-4355-b553-8cc5506e80bc-metrics-certs\") pod \"controller-f8648f98b-96nlf\" (UID: \"1ae49184-91db-4355-b553-8cc5506e80bc\") " pod="metallb-system/controller-f8648f98b-96nlf" Dec 03 20:21:54.317793 master-0 kubenswrapper[29252]: I1203 20:21:54.315111 29252 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 20:21:54.317793 master-0 kubenswrapper[29252]: I1203 20:21:54.315464 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52511992-a397-485f-b709-f81257ee8e16-metrics-certs\") pod \"speaker-j26tj\" (UID: \"52511992-a397-485f-b709-f81257ee8e16\") " pod="metallb-system/speaker-j26tj" Dec 03 20:21:54.333579 master-0 kubenswrapper[29252]: I1203 20:21:54.333536 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr4wl\" (UniqueName: \"kubernetes.io/projected/52511992-a397-485f-b709-f81257ee8e16-kube-api-access-tr4wl\") pod \"speaker-j26tj\" (UID: \"52511992-a397-485f-b709-f81257ee8e16\") " pod="metallb-system/speaker-j26tj" Dec 03 20:21:54.334317 master-0 
kubenswrapper[29252]: I1203 20:21:54.334280 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ae49184-91db-4355-b553-8cc5506e80bc-cert\") pod \"controller-f8648f98b-96nlf\" (UID: \"1ae49184-91db-4355-b553-8cc5506e80bc\") " pod="metallb-system/controller-f8648f98b-96nlf" Dec 03 20:21:54.339631 master-0 kubenswrapper[29252]: I1203 20:21:54.338087 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-58pgc" Dec 03 20:21:54.345297 master-0 kubenswrapper[29252]: I1203 20:21:54.341750 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxj2m\" (UniqueName: \"kubernetes.io/projected/1ae49184-91db-4355-b553-8cc5506e80bc-kube-api-access-pxj2m\") pod \"controller-f8648f98b-96nlf\" (UID: \"1ae49184-91db-4355-b553-8cc5506e80bc\") " pod="metallb-system/controller-f8648f98b-96nlf" Dec 03 20:21:54.357790 master-0 kubenswrapper[29252]: I1203 20:21:54.356142 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:21:54.445608 master-0 kubenswrapper[29252]: I1203 20:21:54.445546 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-96nlf" Dec 03 20:21:54.755023 master-0 kubenswrapper[29252]: I1203 20:21:54.753896 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-58pgc"] Dec 03 20:21:54.756252 master-0 kubenswrapper[29252]: W1203 20:21:54.756204 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bebbe36_46ba_47e9_b53e_2c83abe9c329.slice/crio-11b4c38adaac73a0a64b0faa5bd097b334c866ff974aa223dfc19057bdb33db3 WatchSource:0}: Error finding container 11b4c38adaac73a0a64b0faa5bd097b334c866ff974aa223dfc19057bdb33db3: Status 404 returned error can't find the container with id 11b4c38adaac73a0a64b0faa5bd097b334c866ff974aa223dfc19057bdb33db3 Dec 03 20:21:54.821857 master-0 kubenswrapper[29252]: I1203 20:21:54.821491 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/52511992-a397-485f-b709-f81257ee8e16-memberlist\") pod \"speaker-j26tj\" (UID: \"52511992-a397-485f-b709-f81257ee8e16\") " pod="metallb-system/speaker-j26tj" Dec 03 20:21:54.821857 master-0 kubenswrapper[29252]: E1203 20:21:54.821690 29252 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 20:21:54.821857 master-0 kubenswrapper[29252]: E1203 20:21:54.821811 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52511992-a397-485f-b709-f81257ee8e16-memberlist podName:52511992-a397-485f-b709-f81257ee8e16 nodeName:}" failed. No retries permitted until 2025-12-03 20:21:55.821751737 +0000 UTC m=+750.635296700 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/52511992-a397-485f-b709-f81257ee8e16-memberlist") pod "speaker-j26tj" (UID: "52511992-a397-485f-b709-f81257ee8e16") : secret "metallb-memberlist" not found Dec 03 20:21:54.895697 master-0 kubenswrapper[29252]: I1203 20:21:54.895627 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-96nlf"] Dec 03 20:21:54.902032 master-0 kubenswrapper[29252]: W1203 20:21:54.901948 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ae49184_91db_4355_b553_8cc5506e80bc.slice/crio-a98538ea84e893efe819214abacac5e3d090416b9af8736b2aa0a5a4f4596772 WatchSource:0}: Error finding container a98538ea84e893efe819214abacac5e3d090416b9af8736b2aa0a5a4f4596772: Status 404 returned error can't find the container with id a98538ea84e893efe819214abacac5e3d090416b9af8736b2aa0a5a4f4596772 Dec 03 20:21:55.344841 master-0 kubenswrapper[29252]: I1203 20:21:55.344640 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-c7bbm" event={"ID":"4977c492-5b52-447d-ab42-4a70601a0da4","Type":"ContainerStarted","Data":"9e72c709cebc4a5a5b1b8a34c872172fcd6248a1c1d5b94e165e8f211c0c7dfd"} Dec 03 20:21:55.346181 master-0 kubenswrapper[29252]: I1203 20:21:55.346140 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-58pgc" event={"ID":"2bebbe36-46ba-47e9-b53e-2c83abe9c329","Type":"ContainerStarted","Data":"11b4c38adaac73a0a64b0faa5bd097b334c866ff974aa223dfc19057bdb33db3"} Dec 03 20:21:55.348313 master-0 kubenswrapper[29252]: I1203 20:21:55.348239 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-96nlf" event={"ID":"1ae49184-91db-4355-b553-8cc5506e80bc","Type":"ContainerStarted","Data":"d3c9ce695c5e6b7dc6479319f168d3bfa2d477dd1efe8ed6b51d898be9b156f3"} Dec 03 20:21:55.348394 
master-0 kubenswrapper[29252]: I1203 20:21:55.348313 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-96nlf" event={"ID":"1ae49184-91db-4355-b553-8cc5506e80bc","Type":"ContainerStarted","Data":"a98538ea84e893efe819214abacac5e3d090416b9af8736b2aa0a5a4f4596772"} Dec 03 20:21:55.845474 master-0 kubenswrapper[29252]: I1203 20:21:55.844478 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/52511992-a397-485f-b709-f81257ee8e16-memberlist\") pod \"speaker-j26tj\" (UID: \"52511992-a397-485f-b709-f81257ee8e16\") " pod="metallb-system/speaker-j26tj" Dec 03 20:21:55.847716 master-0 kubenswrapper[29252]: I1203 20:21:55.847657 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/52511992-a397-485f-b709-f81257ee8e16-memberlist\") pod \"speaker-j26tj\" (UID: \"52511992-a397-485f-b709-f81257ee8e16\") " pod="metallb-system/speaker-j26tj" Dec 03 20:21:55.929133 master-0 kubenswrapper[29252]: I1203 20:21:55.929001 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-j26tj" Dec 03 20:21:55.976804 master-0 kubenswrapper[29252]: W1203 20:21:55.974261 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52511992_a397_485f_b709_f81257ee8e16.slice/crio-24afcd7b235327521d262ccf12143b60bac0e8805c47447c5642e405b5765c6f WatchSource:0}: Error finding container 24afcd7b235327521d262ccf12143b60bac0e8805c47447c5642e405b5765c6f: Status 404 returned error can't find the container with id 24afcd7b235327521d262ccf12143b60bac0e8805c47447c5642e405b5765c6f Dec 03 20:21:56.346663 master-0 kubenswrapper[29252]: I1203 20:21:56.346592 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-xmxkb"] Dec 03 20:21:56.350926 master-0 kubenswrapper[29252]: I1203 20:21:56.350881 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xmxkb" Dec 03 20:21:56.391766 master-0 kubenswrapper[29252]: I1203 20:21:56.391708 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-j26tj" event={"ID":"52511992-a397-485f-b709-f81257ee8e16","Type":"ContainerStarted","Data":"f2a8bb4bdc50c665f8295758d0b43097ddb7f1272d243847b74ef7a2c1ca0cf9"} Dec 03 20:21:56.391766 master-0 kubenswrapper[29252]: I1203 20:21:56.391768 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-j26tj" event={"ID":"52511992-a397-485f-b709-f81257ee8e16","Type":"ContainerStarted","Data":"24afcd7b235327521d262ccf12143b60bac0e8805c47447c5642e405b5765c6f"} Dec 03 20:21:56.396894 master-0 kubenswrapper[29252]: I1203 20:21:56.396857 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9cg4d"] Dec 03 20:21:56.398307 master-0 kubenswrapper[29252]: I1203 20:21:56.398275 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9cg4d" Dec 03 20:21:56.404546 master-0 kubenswrapper[29252]: I1203 20:21:56.404459 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 03 20:21:56.408699 master-0 kubenswrapper[29252]: I1203 20:21:56.408615 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-xmxkb"] Dec 03 20:21:56.432764 master-0 kubenswrapper[29252]: I1203 20:21:56.432335 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-zfhdt"] Dec 03 20:21:56.433584 master-0 kubenswrapper[29252]: I1203 20:21:56.433560 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-zfhdt" Dec 03 20:21:56.438349 master-0 kubenswrapper[29252]: I1203 20:21:56.438307 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9cg4d"] Dec 03 20:21:56.453810 master-0 kubenswrapper[29252]: I1203 20:21:56.453694 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st7xz\" (UniqueName: \"kubernetes.io/projected/6980a7eb-a3c8-4496-87aa-b56680009c84-kube-api-access-st7xz\") pod \"nmstate-metrics-7f946cbc9-xmxkb\" (UID: \"6980a7eb-a3c8-4496-87aa-b56680009c84\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xmxkb" Dec 03 20:21:56.491117 master-0 kubenswrapper[29252]: I1203 20:21:56.491048 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-9nn22"] Dec 03 20:21:56.491994 master-0 kubenswrapper[29252]: I1203 20:21:56.491969 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-9nn22" Dec 03 20:21:56.495540 master-0 kubenswrapper[29252]: I1203 20:21:56.495493 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 03 20:21:56.495632 master-0 kubenswrapper[29252]: I1203 20:21:56.495533 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 03 20:21:56.514017 master-0 kubenswrapper[29252]: I1203 20:21:56.511880 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-9nn22"] Dec 03 20:21:56.555638 master-0 kubenswrapper[29252]: I1203 20:21:56.555554 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0944c190-d25f-481e-b59a-75869f8dc9e2-nmstate-lock\") pod \"nmstate-handler-zfhdt\" (UID: \"0944c190-d25f-481e-b59a-75869f8dc9e2\") " pod="openshift-nmstate/nmstate-handler-zfhdt" Dec 03 20:21:56.555638 master-0 kubenswrapper[29252]: I1203 20:21:56.555607 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st7xz\" (UniqueName: \"kubernetes.io/projected/6980a7eb-a3c8-4496-87aa-b56680009c84-kube-api-access-st7xz\") pod \"nmstate-metrics-7f946cbc9-xmxkb\" (UID: \"6980a7eb-a3c8-4496-87aa-b56680009c84\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xmxkb" Dec 03 20:21:56.555638 master-0 kubenswrapper[29252]: I1203 20:21:56.555630 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0944c190-d25f-481e-b59a-75869f8dc9e2-dbus-socket\") pod \"nmstate-handler-zfhdt\" (UID: \"0944c190-d25f-481e-b59a-75869f8dc9e2\") " pod="openshift-nmstate/nmstate-handler-zfhdt" Dec 03 20:21:56.555638 master-0 kubenswrapper[29252]: I1203 20:21:56.555654 29252 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch4xd\" (UniqueName: \"kubernetes.io/projected/0944c190-d25f-481e-b59a-75869f8dc9e2-kube-api-access-ch4xd\") pod \"nmstate-handler-zfhdt\" (UID: \"0944c190-d25f-481e-b59a-75869f8dc9e2\") " pod="openshift-nmstate/nmstate-handler-zfhdt" Dec 03 20:21:56.556272 master-0 kubenswrapper[29252]: I1203 20:21:56.555691 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjk2x\" (UniqueName: \"kubernetes.io/projected/cabd9912-85fb-4fca-a116-1c9bf1ab19e1-kube-api-access-gjk2x\") pod \"nmstate-webhook-5f6d4c5ccb-9cg4d\" (UID: \"cabd9912-85fb-4fca-a116-1c9bf1ab19e1\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9cg4d" Dec 03 20:21:56.556272 master-0 kubenswrapper[29252]: I1203 20:21:56.555733 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0944c190-d25f-481e-b59a-75869f8dc9e2-ovs-socket\") pod \"nmstate-handler-zfhdt\" (UID: \"0944c190-d25f-481e-b59a-75869f8dc9e2\") " pod="openshift-nmstate/nmstate-handler-zfhdt" Dec 03 20:21:56.556272 master-0 kubenswrapper[29252]: I1203 20:21:56.555767 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/cabd9912-85fb-4fca-a116-1c9bf1ab19e1-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-9cg4d\" (UID: \"cabd9912-85fb-4fca-a116-1c9bf1ab19e1\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9cg4d" Dec 03 20:21:56.575492 master-0 kubenswrapper[29252]: I1203 20:21:56.575442 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st7xz\" (UniqueName: \"kubernetes.io/projected/6980a7eb-a3c8-4496-87aa-b56680009c84-kube-api-access-st7xz\") pod \"nmstate-metrics-7f946cbc9-xmxkb\" (UID: 
\"6980a7eb-a3c8-4496-87aa-b56680009c84\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xmxkb" Dec 03 20:21:56.657368 master-0 kubenswrapper[29252]: I1203 20:21:56.657221 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7010359f-6e7a-41c9-9a49-f29d67babf3c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-9nn22\" (UID: \"7010359f-6e7a-41c9-9a49-f29d67babf3c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-9nn22" Dec 03 20:21:56.657368 master-0 kubenswrapper[29252]: I1203 20:21:56.657298 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjk2x\" (UniqueName: \"kubernetes.io/projected/cabd9912-85fb-4fca-a116-1c9bf1ab19e1-kube-api-access-gjk2x\") pod \"nmstate-webhook-5f6d4c5ccb-9cg4d\" (UID: \"cabd9912-85fb-4fca-a116-1c9bf1ab19e1\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9cg4d" Dec 03 20:21:56.657368 master-0 kubenswrapper[29252]: I1203 20:21:56.657368 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0944c190-d25f-481e-b59a-75869f8dc9e2-ovs-socket\") pod \"nmstate-handler-zfhdt\" (UID: \"0944c190-d25f-481e-b59a-75869f8dc9e2\") " pod="openshift-nmstate/nmstate-handler-zfhdt" Dec 03 20:21:56.657632 master-0 kubenswrapper[29252]: I1203 20:21:56.657414 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwtm9\" (UniqueName: \"kubernetes.io/projected/7010359f-6e7a-41c9-9a49-f29d67babf3c-kube-api-access-kwtm9\") pod \"nmstate-console-plugin-7fbb5f6569-9nn22\" (UID: \"7010359f-6e7a-41c9-9a49-f29d67babf3c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-9nn22" Dec 03 20:21:56.657632 master-0 kubenswrapper[29252]: I1203 20:21:56.657460 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/cabd9912-85fb-4fca-a116-1c9bf1ab19e1-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-9cg4d\" (UID: \"cabd9912-85fb-4fca-a116-1c9bf1ab19e1\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9cg4d"
Dec 03 20:21:56.657632 master-0 kubenswrapper[29252]: I1203 20:21:56.657540 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0944c190-d25f-481e-b59a-75869f8dc9e2-ovs-socket\") pod \"nmstate-handler-zfhdt\" (UID: \"0944c190-d25f-481e-b59a-75869f8dc9e2\") " pod="openshift-nmstate/nmstate-handler-zfhdt"
Dec 03 20:21:56.657632 master-0 kubenswrapper[29252]: I1203 20:21:56.657545 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7010359f-6e7a-41c9-9a49-f29d67babf3c-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-9nn22\" (UID: \"7010359f-6e7a-41c9-9a49-f29d67babf3c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-9nn22"
Dec 03 20:21:56.657759 master-0 kubenswrapper[29252]: I1203 20:21:56.657661 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0944c190-d25f-481e-b59a-75869f8dc9e2-nmstate-lock\") pod \"nmstate-handler-zfhdt\" (UID: \"0944c190-d25f-481e-b59a-75869f8dc9e2\") " pod="openshift-nmstate/nmstate-handler-zfhdt"
Dec 03 20:21:56.657759 master-0 kubenswrapper[29252]: I1203 20:21:56.657730 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0944c190-d25f-481e-b59a-75869f8dc9e2-dbus-socket\") pod \"nmstate-handler-zfhdt\" (UID: \"0944c190-d25f-481e-b59a-75869f8dc9e2\") " pod="openshift-nmstate/nmstate-handler-zfhdt"
Dec 03 20:21:56.657864 master-0 kubenswrapper[29252]: I1203 20:21:56.657812 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch4xd\" (UniqueName: \"kubernetes.io/projected/0944c190-d25f-481e-b59a-75869f8dc9e2-kube-api-access-ch4xd\") pod \"nmstate-handler-zfhdt\" (UID: \"0944c190-d25f-481e-b59a-75869f8dc9e2\") " pod="openshift-nmstate/nmstate-handler-zfhdt"
Dec 03 20:21:56.659520 master-0 kubenswrapper[29252]: I1203 20:21:56.658069 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0944c190-d25f-481e-b59a-75869f8dc9e2-dbus-socket\") pod \"nmstate-handler-zfhdt\" (UID: \"0944c190-d25f-481e-b59a-75869f8dc9e2\") " pod="openshift-nmstate/nmstate-handler-zfhdt"
Dec 03 20:21:56.659520 master-0 kubenswrapper[29252]: I1203 20:21:56.658409 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0944c190-d25f-481e-b59a-75869f8dc9e2-nmstate-lock\") pod \"nmstate-handler-zfhdt\" (UID: \"0944c190-d25f-481e-b59a-75869f8dc9e2\") " pod="openshift-nmstate/nmstate-handler-zfhdt"
Dec 03 20:21:56.670941 master-0 kubenswrapper[29252]: I1203 20:21:56.670894 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/cabd9912-85fb-4fca-a116-1c9bf1ab19e1-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-9cg4d\" (UID: \"cabd9912-85fb-4fca-a116-1c9bf1ab19e1\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9cg4d"
Dec 03 20:21:56.680467 master-0 kubenswrapper[29252]: I1203 20:21:56.680421 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjk2x\" (UniqueName: \"kubernetes.io/projected/cabd9912-85fb-4fca-a116-1c9bf1ab19e1-kube-api-access-gjk2x\") pod \"nmstate-webhook-5f6d4c5ccb-9cg4d\" (UID: \"cabd9912-85fb-4fca-a116-1c9bf1ab19e1\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9cg4d"
Dec 03 20:21:56.684417 master-0 kubenswrapper[29252]: I1203 20:21:56.684387 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch4xd\" (UniqueName: \"kubernetes.io/projected/0944c190-d25f-481e-b59a-75869f8dc9e2-kube-api-access-ch4xd\") pod \"nmstate-handler-zfhdt\" (UID: \"0944c190-d25f-481e-b59a-75869f8dc9e2\") " pod="openshift-nmstate/nmstate-handler-zfhdt"
Dec 03 20:21:56.697017 master-0 kubenswrapper[29252]: I1203 20:21:56.696624 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xmxkb"
Dec 03 20:21:56.727446 master-0 kubenswrapper[29252]: I1203 20:21:56.727390 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9cg4d"
Dec 03 20:21:56.758219 master-0 kubenswrapper[29252]: I1203 20:21:56.758141 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-zfhdt"
Dec 03 20:21:56.759330 master-0 kubenswrapper[29252]: I1203 20:21:56.759233 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwtm9\" (UniqueName: \"kubernetes.io/projected/7010359f-6e7a-41c9-9a49-f29d67babf3c-kube-api-access-kwtm9\") pod \"nmstate-console-plugin-7fbb5f6569-9nn22\" (UID: \"7010359f-6e7a-41c9-9a49-f29d67babf3c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-9nn22"
Dec 03 20:21:56.759435 master-0 kubenswrapper[29252]: I1203 20:21:56.759346 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7010359f-6e7a-41c9-9a49-f29d67babf3c-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-9nn22\" (UID: \"7010359f-6e7a-41c9-9a49-f29d67babf3c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-9nn22"
Dec 03 20:21:56.759435 master-0 kubenswrapper[29252]: I1203 20:21:56.759418 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7010359f-6e7a-41c9-9a49-f29d67babf3c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-9nn22\" (UID: \"7010359f-6e7a-41c9-9a49-f29d67babf3c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-9nn22"
Dec 03 20:21:56.760157 master-0 kubenswrapper[29252]: I1203 20:21:56.760130 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7010359f-6e7a-41c9-9a49-f29d67babf3c-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-9nn22\" (UID: \"7010359f-6e7a-41c9-9a49-f29d67babf3c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-9nn22"
Dec 03 20:21:56.763419 master-0 kubenswrapper[29252]: I1203 20:21:56.763391 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7010359f-6e7a-41c9-9a49-f29d67babf3c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-9nn22\" (UID: \"7010359f-6e7a-41c9-9a49-f29d67babf3c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-9nn22"
Dec 03 20:21:57.349285 master-0 kubenswrapper[29252]: I1203 20:21:57.343519 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7ffc94f8bc-2s94b"]
Dec 03 20:21:57.349285 master-0 kubenswrapper[29252]: I1203 20:21:57.345079 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.369636 master-0 kubenswrapper[29252]: I1203 20:21:57.366648 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7ffc94f8bc-2s94b"]
Dec 03 20:21:57.369636 master-0 kubenswrapper[29252]: I1203 20:21:57.366721 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwtm9\" (UniqueName: \"kubernetes.io/projected/7010359f-6e7a-41c9-9a49-f29d67babf3c-kube-api-access-kwtm9\") pod \"nmstate-console-plugin-7fbb5f6569-9nn22\" (UID: \"7010359f-6e7a-41c9-9a49-f29d67babf3c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-9nn22"
Dec 03 20:21:57.386190 master-0 kubenswrapper[29252]: I1203 20:21:57.386156 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-xmxkb"]
Dec 03 20:21:57.410967 master-0 kubenswrapper[29252]: I1203 20:21:57.410884 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9cg4d"]
Dec 03 20:21:57.411594 master-0 kubenswrapper[29252]: W1203 20:21:57.411553 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6980a7eb_a3c8_4496_87aa_b56680009c84.slice/crio-f48b9199933ab2d5099143ad0887cacdf1c4c07eed287c23416e4a88ef9ccd46 WatchSource:0}: Error finding container f48b9199933ab2d5099143ad0887cacdf1c4c07eed287c23416e4a88ef9ccd46: Status 404 returned error can't find the container with id f48b9199933ab2d5099143ad0887cacdf1c4c07eed287c23416e4a88ef9ccd46
Dec 03 20:21:57.436995 master-0 kubenswrapper[29252]: I1203 20:21:57.431301 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-9nn22"
Dec 03 20:21:57.436995 master-0 kubenswrapper[29252]: W1203 20:21:57.432804 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcabd9912_85fb_4fca_a116_1c9bf1ab19e1.slice/crio-cc81bb46b5b3d906640608fd56a5cc864ac40d53439ca8fc64a4de1fa90e3fa1 WatchSource:0}: Error finding container cc81bb46b5b3d906640608fd56a5cc864ac40d53439ca8fc64a4de1fa90e3fa1: Status 404 returned error can't find the container with id cc81bb46b5b3d906640608fd56a5cc864ac40d53439ca8fc64a4de1fa90e3fa1
Dec 03 20:21:57.451960 master-0 kubenswrapper[29252]: I1203 20:21:57.451885 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zfhdt" event={"ID":"0944c190-d25f-481e-b59a-75869f8dc9e2","Type":"ContainerStarted","Data":"fdffbe0ff617df6d9e0b3ea0dc8ecfe0dd06c71a5da4e7ad1b2a26c3c952fb03"}
Dec 03 20:21:57.479398 master-0 kubenswrapper[29252]: I1203 20:21:57.479358 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f570066d-b49f-40fa-b901-9a89f265d1b1-console-config\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.479561 master-0 kubenswrapper[29252]: I1203 20:21:57.479433 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f570066d-b49f-40fa-b901-9a89f265d1b1-console-oauth-config\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.479877 master-0 kubenswrapper[29252]: I1203 20:21:57.479859 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97bk7\" (UniqueName: \"kubernetes.io/projected/f570066d-b49f-40fa-b901-9a89f265d1b1-kube-api-access-97bk7\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.479931 master-0 kubenswrapper[29252]: I1203 20:21:57.479921 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f570066d-b49f-40fa-b901-9a89f265d1b1-oauth-serving-cert\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.479981 master-0 kubenswrapper[29252]: I1203 20:21:57.479940 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f570066d-b49f-40fa-b901-9a89f265d1b1-console-serving-cert\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.480081 master-0 kubenswrapper[29252]: I1203 20:21:57.480006 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f570066d-b49f-40fa-b901-9a89f265d1b1-trusted-ca-bundle\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.480138 master-0 kubenswrapper[29252]: I1203 20:21:57.480106 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f570066d-b49f-40fa-b901-9a89f265d1b1-service-ca\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.583748 master-0 kubenswrapper[29252]: I1203 20:21:57.583593 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f570066d-b49f-40fa-b901-9a89f265d1b1-oauth-serving-cert\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.583748 master-0 kubenswrapper[29252]: I1203 20:21:57.583668 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f570066d-b49f-40fa-b901-9a89f265d1b1-console-serving-cert\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.583748 master-0 kubenswrapper[29252]: I1203 20:21:57.583712 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f570066d-b49f-40fa-b901-9a89f265d1b1-trusted-ca-bundle\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.584260 master-0 kubenswrapper[29252]: I1203 20:21:57.583795 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f570066d-b49f-40fa-b901-9a89f265d1b1-service-ca\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.584260 master-0 kubenswrapper[29252]: I1203 20:21:57.583933 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f570066d-b49f-40fa-b901-9a89f265d1b1-console-config\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.584260 master-0 kubenswrapper[29252]: I1203 20:21:57.583954 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f570066d-b49f-40fa-b901-9a89f265d1b1-console-oauth-config\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.584260 master-0 kubenswrapper[29252]: I1203 20:21:57.584007 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97bk7\" (UniqueName: \"kubernetes.io/projected/f570066d-b49f-40fa-b901-9a89f265d1b1-kube-api-access-97bk7\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.585570 master-0 kubenswrapper[29252]: I1203 20:21:57.585516 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f570066d-b49f-40fa-b901-9a89f265d1b1-oauth-serving-cert\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.585570 master-0 kubenswrapper[29252]: I1203 20:21:57.585528 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f570066d-b49f-40fa-b901-9a89f265d1b1-service-ca\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.586160 master-0 kubenswrapper[29252]: I1203 20:21:57.586130 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f570066d-b49f-40fa-b901-9a89f265d1b1-console-config\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.589853 master-0 kubenswrapper[29252]: I1203 20:21:57.589722 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f570066d-b49f-40fa-b901-9a89f265d1b1-trusted-ca-bundle\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.591520 master-0 kubenswrapper[29252]: I1203 20:21:57.591387 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f570066d-b49f-40fa-b901-9a89f265d1b1-console-oauth-config\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.592099 master-0 kubenswrapper[29252]: I1203 20:21:57.592019 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f570066d-b49f-40fa-b901-9a89f265d1b1-console-serving-cert\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.600331 master-0 kubenswrapper[29252]: I1203 20:21:57.600056 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97bk7\" (UniqueName: \"kubernetes.io/projected/f570066d-b49f-40fa-b901-9a89f265d1b1-kube-api-access-97bk7\") pod \"console-7ffc94f8bc-2s94b\" (UID: \"f570066d-b49f-40fa-b901-9a89f265d1b1\") " pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:57.688620 master-0 kubenswrapper[29252]: I1203 20:21:57.688022 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:21:58.001346 master-0 kubenswrapper[29252]: I1203 20:21:58.001284 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-9nn22"]
Dec 03 20:21:58.031123 master-0 kubenswrapper[29252]: W1203 20:21:58.031065 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7010359f_6e7a_41c9_9a49_f29d67babf3c.slice/crio-757d94f804f964cd7775508d6c9e20809a4a603b297eb0fe46d2fea3e4a371b1 WatchSource:0}: Error finding container 757d94f804f964cd7775508d6c9e20809a4a603b297eb0fe46d2fea3e4a371b1: Status 404 returned error can't find the container with id 757d94f804f964cd7775508d6c9e20809a4a603b297eb0fe46d2fea3e4a371b1
Dec 03 20:21:58.449688 master-0 kubenswrapper[29252]: I1203 20:21:58.449620 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9cg4d" event={"ID":"cabd9912-85fb-4fca-a116-1c9bf1ab19e1","Type":"ContainerStarted","Data":"cc81bb46b5b3d906640608fd56a5cc864ac40d53439ca8fc64a4de1fa90e3fa1"}
Dec 03 20:21:58.451322 master-0 kubenswrapper[29252]: I1203 20:21:58.451286 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xmxkb" event={"ID":"6980a7eb-a3c8-4496-87aa-b56680009c84","Type":"ContainerStarted","Data":"f48b9199933ab2d5099143ad0887cacdf1c4c07eed287c23416e4a88ef9ccd46"}
Dec 03 20:21:58.452302 master-0 kubenswrapper[29252]: I1203 20:21:58.452267 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-9nn22" event={"ID":"7010359f-6e7a-41c9-9a49-f29d67babf3c","Type":"ContainerStarted","Data":"757d94f804f964cd7775508d6c9e20809a4a603b297eb0fe46d2fea3e4a371b1"}
Dec 03 20:21:58.453886 master-0 kubenswrapper[29252]: I1203 20:21:58.453807 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-96nlf" event={"ID":"1ae49184-91db-4355-b553-8cc5506e80bc","Type":"ContainerStarted","Data":"1fdb54b27229e11e5a7f9829406a805eb8ea1990344dce1003c223e7fe9134bb"}
Dec 03 20:21:58.454394 master-0 kubenswrapper[29252]: I1203 20:21:58.454296 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-96nlf"
Dec 03 20:21:58.618298 master-0 kubenswrapper[29252]: I1203 20:21:58.618201 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-96nlf" podStartSLOduration=1.592138512 podStartE2EDuration="4.61818047s" podCreationTimestamp="2025-12-03 20:21:54 +0000 UTC" firstStartedPulling="2025-12-03 20:21:55.071302678 +0000 UTC m=+749.884847671" lastFinishedPulling="2025-12-03 20:21:58.097344676 +0000 UTC m=+752.910889629" observedRunningTime="2025-12-03 20:21:58.612169343 +0000 UTC m=+753.425714296" watchObservedRunningTime="2025-12-03 20:21:58.61818047 +0000 UTC m=+753.431725433"
Dec 03 20:21:58.657084 master-0 kubenswrapper[29252]: I1203 20:21:58.657025 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7ffc94f8bc-2s94b"]
Dec 03 20:21:58.659574 master-0 kubenswrapper[29252]: W1203 20:21:58.659523 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf570066d_b49f_40fa_b901_9a89f265d1b1.slice/crio-e3ae3e39bb5725207c1d53376e690bd41da3395465b02851af90051ab2110a41 WatchSource:0}: Error finding container e3ae3e39bb5725207c1d53376e690bd41da3395465b02851af90051ab2110a41: Status 404 returned error can't find the container with id e3ae3e39bb5725207c1d53376e690bd41da3395465b02851af90051ab2110a41
Dec 03 20:21:59.473215 master-0 kubenswrapper[29252]: I1203 20:21:59.473127 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7ffc94f8bc-2s94b" event={"ID":"f570066d-b49f-40fa-b901-9a89f265d1b1","Type":"ContainerStarted","Data":"e3ae3e39bb5725207c1d53376e690bd41da3395465b02851af90051ab2110a41"}
Dec 03 20:22:00.488162 master-0 kubenswrapper[29252]: I1203 20:22:00.488113 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7ffc94f8bc-2s94b" event={"ID":"f570066d-b49f-40fa-b901-9a89f265d1b1","Type":"ContainerStarted","Data":"fd26befc56f04b367fc696e35b106b217bf27c296262d2d36b89d47d267a7ca5"}
Dec 03 20:22:00.559765 master-0 kubenswrapper[29252]: I1203 20:22:00.559628 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7ffc94f8bc-2s94b" podStartSLOduration=4.559609331 podStartE2EDuration="4.559609331s" podCreationTimestamp="2025-12-03 20:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:22:00.553535283 +0000 UTC m=+755.367080256" watchObservedRunningTime="2025-12-03 20:22:00.559609331 +0000 UTC m=+755.373154284"
Dec 03 20:22:04.529590 master-0 kubenswrapper[29252]: I1203 20:22:04.529032 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-58pgc" event={"ID":"2bebbe36-46ba-47e9-b53e-2c83abe9c329","Type":"ContainerStarted","Data":"cf2c7c06fac1ea8e5fecfa453df55b17b59a75e7b78c26d8099643c27793f75c"}
Dec 03 20:22:04.529590 master-0 kubenswrapper[29252]: I1203 20:22:04.529094 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-58pgc"
Dec 03 20:22:04.532285 master-0 kubenswrapper[29252]: I1203 20:22:04.532234 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9cg4d" event={"ID":"cabd9912-85fb-4fca-a116-1c9bf1ab19e1","Type":"ContainerStarted","Data":"93d641d2bb94aab4e15ac018c99b00c9217603e1a6aa2c5ea7ddda7541a637c3"}
Dec 03 20:22:04.532389 master-0 kubenswrapper[29252]: I1203 20:22:04.532295 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9cg4d"
Dec 03 20:22:04.534901 master-0 kubenswrapper[29252]: I1203 20:22:04.534866 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xmxkb" event={"ID":"6980a7eb-a3c8-4496-87aa-b56680009c84","Type":"ContainerStarted","Data":"52e957b62e8c6ad72398c0351fb68d5390694c35e8f22ca21c9d9f0f4399e94a"}
Dec 03 20:22:04.534901 master-0 kubenswrapper[29252]: I1203 20:22:04.534899 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xmxkb" event={"ID":"6980a7eb-a3c8-4496-87aa-b56680009c84","Type":"ContainerStarted","Data":"b126fb6b3c1ba468b5d3a6ef9037a25388de27d530b44f887c6d040d8a3450f9"}
Dec 03 20:22:04.537004 master-0 kubenswrapper[29252]: I1203 20:22:04.536972 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zfhdt" event={"ID":"0944c190-d25f-481e-b59a-75869f8dc9e2","Type":"ContainerStarted","Data":"766aaa099e53519abcf31889425808f2c4df3654d6706bf0b515f43ee9eb9f92"}
Dec 03 20:22:04.537257 master-0 kubenswrapper[29252]: I1203 20:22:04.537231 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-zfhdt"
Dec 03 20:22:04.538876 master-0 kubenswrapper[29252]: I1203 20:22:04.538848 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-9nn22" event={"ID":"7010359f-6e7a-41c9-9a49-f29d67babf3c","Type":"ContainerStarted","Data":"12ff1c78a8e51605afc06b4c2719ceb1a214dfc357deeed14abe6846b4f00246"}
Dec 03 20:22:04.541198 master-0 kubenswrapper[29252]: I1203 20:22:04.541168 29252 generic.go:334] "Generic (PLEG): container finished" podID="4977c492-5b52-447d-ab42-4a70601a0da4" containerID="e687ac6a23fa50cd09fc1e18d495e41969cdacd6acd0850fdac324d40b01b192" exitCode=0
Dec 03 20:22:04.541279 master-0 kubenswrapper[29252]: I1203 20:22:04.541225 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-c7bbm" event={"ID":"4977c492-5b52-447d-ab42-4a70601a0da4","Type":"ContainerDied","Data":"e687ac6a23fa50cd09fc1e18d495e41969cdacd6acd0850fdac324d40b01b192"}
Dec 03 20:22:04.547514 master-0 kubenswrapper[29252]: I1203 20:22:04.547422 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-j26tj" event={"ID":"52511992-a397-485f-b709-f81257ee8e16","Type":"ContainerStarted","Data":"5e2bcad815837e9f4ab7469c59cfc2b9133a52241529249c1e6639f5ac288ce6"}
Dec 03 20:22:04.547629 master-0 kubenswrapper[29252]: I1203 20:22:04.547584 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-j26tj"
Dec 03 20:22:04.564314 master-0 kubenswrapper[29252]: I1203 20:22:04.563383 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-58pgc" podStartSLOduration=3.150361718 podStartE2EDuration="11.563325744s" podCreationTimestamp="2025-12-03 20:21:53 +0000 UTC" firstStartedPulling="2025-12-03 20:21:54.759142388 +0000 UTC m=+749.572687331" lastFinishedPulling="2025-12-03 20:22:03.172106394 +0000 UTC m=+757.985651357" observedRunningTime="2025-12-03 20:22:04.554060699 +0000 UTC m=+759.367605662" watchObservedRunningTime="2025-12-03 20:22:04.563325744 +0000 UTC m=+759.376870717"
Dec 03 20:22:04.597585 master-0 kubenswrapper[29252]: I1203 20:22:04.595442 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-9nn22" podStartSLOduration=3.384574419 podStartE2EDuration="8.595412458s" podCreationTimestamp="2025-12-03 20:21:56 +0000 UTC" firstStartedPulling="2025-12-03 20:21:58.037719601 +0000 UTC m=+752.851264554" lastFinishedPulling="2025-12-03 20:22:03.24855763 +0000 UTC m=+758.062102593" observedRunningTime="2025-12-03 20:22:04.59097756 +0000 UTC m=+759.404522553" watchObservedRunningTime="2025-12-03 20:22:04.595412458 +0000 UTC m=+759.408957451"
Dec 03 20:22:04.629493 master-0 kubenswrapper[29252]: I1203 20:22:04.627038 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-zfhdt" podStartSLOduration=2.237748184 podStartE2EDuration="8.62701729s" podCreationTimestamp="2025-12-03 20:21:56 +0000 UTC" firstStartedPulling="2025-12-03 20:21:56.786067627 +0000 UTC m=+751.599612580" lastFinishedPulling="2025-12-03 20:22:03.175336733 +0000 UTC m=+757.988881686" observedRunningTime="2025-12-03 20:22:04.625630216 +0000 UTC m=+759.439175179" watchObservedRunningTime="2025-12-03 20:22:04.62701729 +0000 UTC m=+759.440562243"
Dec 03 20:22:04.691764 master-0 kubenswrapper[29252]: I1203 20:22:04.691669 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-j26tj" podStartSLOduration=4.288554474 podStartE2EDuration="10.691648047s" podCreationTimestamp="2025-12-03 20:21:54 +0000 UTC" firstStartedPulling="2025-12-03 20:21:56.283367886 +0000 UTC m=+751.096912839" lastFinishedPulling="2025-12-03 20:22:02.686461459 +0000 UTC m=+757.500006412" observedRunningTime="2025-12-03 20:22:04.684528484 +0000 UTC m=+759.498073437" watchObservedRunningTime="2025-12-03 20:22:04.691648047 +0000 UTC m=+759.505193000"
Dec 03 20:22:04.706171 master-0 kubenswrapper[29252]: I1203 20:22:04.705763 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9cg4d" podStartSLOduration=2.9139875 podStartE2EDuration="8.705739151s" podCreationTimestamp="2025-12-03 20:21:56 +0000 UTC" firstStartedPulling="2025-12-03 20:21:57.454805281 +0000 UTC m=+752.268350234" lastFinishedPulling="2025-12-03 20:22:03.246556932 +0000 UTC m=+758.060101885" observedRunningTime="2025-12-03 20:22:04.702966613 +0000 UTC m=+759.516511596" watchObservedRunningTime="2025-12-03 20:22:04.705739151 +0000 UTC m=+759.519284114"
Dec 03 20:22:04.761832 master-0 kubenswrapper[29252]: I1203 20:22:04.758514 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-xmxkb" podStartSLOduration=3.011387528 podStartE2EDuration="8.758497179s" podCreationTimestamp="2025-12-03 20:21:56 +0000 UTC" firstStartedPulling="2025-12-03 20:21:57.425109146 +0000 UTC m=+752.238654109" lastFinishedPulling="2025-12-03 20:22:03.172218807 +0000 UTC m=+757.985763760" observedRunningTime="2025-12-03 20:22:04.730000093 +0000 UTC m=+759.543545076" watchObservedRunningTime="2025-12-03 20:22:04.758497179 +0000 UTC m=+759.572042132"
Dec 03 20:22:05.563213 master-0 kubenswrapper[29252]: I1203 20:22:05.563116 29252 generic.go:334] "Generic (PLEG): container finished" podID="4977c492-5b52-447d-ab42-4a70601a0da4" containerID="ae9e5dc7acac99fd3e4d7bd4e537bf9539869f18b049a2dedf3c75620677c79a" exitCode=0
Dec 03 20:22:05.564371 master-0 kubenswrapper[29252]: I1203 20:22:05.563243 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-c7bbm" event={"ID":"4977c492-5b52-447d-ab42-4a70601a0da4","Type":"ContainerDied","Data":"ae9e5dc7acac99fd3e4d7bd4e537bf9539869f18b049a2dedf3c75620677c79a"}
Dec 03 20:22:06.584742 master-0 kubenswrapper[29252]: I1203 20:22:06.584528 29252 generic.go:334] "Generic (PLEG): container finished" podID="4977c492-5b52-447d-ab42-4a70601a0da4" containerID="0953b2cf35a1052d31a24f096552f41af3a5474f9e5a8294c6fc77a39b29f372" exitCode=0
Dec 03 20:22:06.584742 master-0 kubenswrapper[29252]: I1203 20:22:06.584622 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-c7bbm" event={"ID":"4977c492-5b52-447d-ab42-4a70601a0da4","Type":"ContainerDied","Data":"0953b2cf35a1052d31a24f096552f41af3a5474f9e5a8294c6fc77a39b29f372"}
Dec 03 20:22:07.604177 master-0 kubenswrapper[29252]: I1203 20:22:07.603966 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-c7bbm" event={"ID":"4977c492-5b52-447d-ab42-4a70601a0da4","Type":"ContainerStarted","Data":"749144107c6ce2ed3b2699afd4b94bc89308a396d02a61c5035089c680406be6"}
Dec 03 20:22:07.604177 master-0 kubenswrapper[29252]: I1203 20:22:07.604174 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-c7bbm" event={"ID":"4977c492-5b52-447d-ab42-4a70601a0da4","Type":"ContainerStarted","Data":"b8fd565321ef63218df5ac2b96c33ba16fb285fa0054493dac4bf31bac507131"}
Dec 03 20:22:07.604177 master-0 kubenswrapper[29252]: I1203 20:22:07.604185 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-c7bbm" event={"ID":"4977c492-5b52-447d-ab42-4a70601a0da4","Type":"ContainerStarted","Data":"d2f91350471cbbfa395bcdc596545729f939b60034029a584e8b3c63030b6998"}
Dec 03 20:22:07.604770 master-0 kubenswrapper[29252]: I1203 20:22:07.604197 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-c7bbm" event={"ID":"4977c492-5b52-447d-ab42-4a70601a0da4","Type":"ContainerStarted","Data":"d8eb33253db00489467d2af92178b2215f78884f63eec83d04403717262355d2"}
Dec 03 20:22:07.688893 master-0 kubenswrapper[29252]: I1203 20:22:07.688825 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:22:07.688893 master-0 kubenswrapper[29252]: I1203 20:22:07.688890 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:22:07.692815 master-0 kubenswrapper[29252]: I1203 20:22:07.692733 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:22:08.621060 master-0 kubenswrapper[29252]: I1203 20:22:08.620938 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-c7bbm" event={"ID":"4977c492-5b52-447d-ab42-4a70601a0da4","Type":"ContainerStarted","Data":"bd312ae4f7a7030300804b234a47b20f05601baed4d41c6c5a82fb89c390f15d"}
Dec 03 20:22:08.622433 master-0 kubenswrapper[29252]: I1203 20:22:08.621082 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-c7bbm" event={"ID":"4977c492-5b52-447d-ab42-4a70601a0da4","Type":"ContainerStarted","Data":"87bcf7f6c450acf6264b25c27bbbc0cb93e37aa0c721cd743b65b89011224c97"}
Dec 03 20:22:08.622433 master-0 kubenswrapper[29252]: I1203 20:22:08.621471 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-c7bbm"
Dec 03 20:22:08.626500 master-0 kubenswrapper[29252]: I1203 20:22:08.626440 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7ffc94f8bc-2s94b"
Dec 03 20:22:08.651398 master-0 kubenswrapper[29252]: I1203 20:22:08.651217 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-c7bbm" podStartSLOduration=6.990869028 podStartE2EDuration="15.651197112s" podCreationTimestamp="2025-12-03 20:21:53 +0000 UTC" firstStartedPulling="2025-12-03 20:21:54.513976494 +0000 UTC m=+749.327521447" lastFinishedPulling="2025-12-03 20:22:03.174304558 +0000 UTC m=+757.987849531" observedRunningTime="2025-12-03 20:22:08.649012999 +0000 UTC m=+763.462557972" watchObservedRunningTime="2025-12-03 20:22:08.651197112 +0000 UTC m=+763.464742065"
Dec 03 20:22:08.717661 master-0 kubenswrapper[29252]: I1203 20:22:08.715366 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5656567747-w9bgn"]
Dec 03 20:22:09.357024 master-0 kubenswrapper[29252]: I1203 20:22:09.356964 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-c7bbm"
Dec 03 20:22:09.429517 master-0 kubenswrapper[29252]: I1203 20:22:09.429460 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-c7bbm"
Dec 03 20:22:11.799478 master-0 kubenswrapper[29252]: I1203 20:22:11.799421 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-zfhdt"
Dec 03 20:22:14.347660 master-0 kubenswrapper[29252]: I1203 20:22:14.347564 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-58pgc"
Dec 03 20:22:14.450637 master-0 kubenswrapper[29252]: I1203 20:22:14.450572 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-96nlf"
Dec 03 20:22:15.942310 master-0 kubenswrapper[29252]: I1203 20:22:15.942238 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-j26tj"
Dec 03 20:22:16.733898 master-0 kubenswrapper[29252]: I1203 20:22:16.733798 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9cg4d"
Dec 03 20:22:23.407305 master-0 kubenswrapper[29252]: I1203 20:22:23.407238 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/vg-manager-k68ws"]
Dec 03 20:22:23.409627 master-0 kubenswrapper[29252]: I1203 20:22:23.409582 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/vg-manager-k68ws"
Dec 03 20:22:23.413053 master-0 kubenswrapper[29252]: I1203 20:22:23.411868 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"vg-manager-metrics-cert"
Dec 03 20:22:23.434613 master-0 kubenswrapper[29252]: I1203 20:22:23.431582 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-k68ws"]
Dec 03 20:22:23.495003 master-0 kubenswrapper[29252]: I1203 20:22:23.494058 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-run-udev\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws"
Dec 03 20:22:23.495003 master-0 kubenswrapper[29252]: I1203 20:22:23.494124 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-csi-plugin-dir\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws"
Dec 03 20:22:23.495003 master-0 kubenswrapper[29252]: I1203 20:22:23.494387 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-file-lock-dir\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws"
Dec 03 20:22:23.495003 master-0 kubenswrapper[29252]: I1203 20:22:23.494493 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-node-plugin-dir\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\")
" pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.495003 master-0 kubenswrapper[29252]: I1203 20:22:23.494541 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-device-dir\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.495003 master-0 kubenswrapper[29252]: I1203 20:22:23.494563 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-pod-volumes-dir\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.495003 master-0 kubenswrapper[29252]: I1203 20:22:23.494800 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr8px\" (UniqueName: \"kubernetes.io/projected/ad601dfa-8310-41a9-8abd-119fbed1aa01-kube-api-access-dr8px\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.495003 master-0 kubenswrapper[29252]: I1203 20:22:23.494931 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-sys\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.495003 master-0 kubenswrapper[29252]: I1203 20:22:23.494964 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad601dfa-8310-41a9-8abd-119fbed1aa01-metrics-cert\") pod \"vg-manager-k68ws\" (UID: 
\"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.495003 master-0 kubenswrapper[29252]: I1203 20:22:23.494988 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-registration-dir\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.495003 master-0 kubenswrapper[29252]: I1203 20:22:23.495025 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-lvmd-config\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.597119 master-0 kubenswrapper[29252]: I1203 20:22:23.597043 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-file-lock-dir\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.597363 master-0 kubenswrapper[29252]: I1203 20:22:23.597130 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-node-plugin-dir\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.597363 master-0 kubenswrapper[29252]: I1203 20:22:23.597161 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-device-dir\") pod \"vg-manager-k68ws\" (UID: 
\"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.597363 master-0 kubenswrapper[29252]: I1203 20:22:23.597185 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-pod-volumes-dir\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.597363 master-0 kubenswrapper[29252]: I1203 20:22:23.597227 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr8px\" (UniqueName: \"kubernetes.io/projected/ad601dfa-8310-41a9-8abd-119fbed1aa01-kube-api-access-dr8px\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.597363 master-0 kubenswrapper[29252]: I1203 20:22:23.597257 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-sys\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.597363 master-0 kubenswrapper[29252]: I1203 20:22:23.597280 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad601dfa-8310-41a9-8abd-119fbed1aa01-metrics-cert\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.597363 master-0 kubenswrapper[29252]: I1203 20:22:23.597301 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-registration-dir\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " 
pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.597363 master-0 kubenswrapper[29252]: I1203 20:22:23.597329 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-lvmd-config\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.597708 master-0 kubenswrapper[29252]: I1203 20:22:23.597385 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-run-udev\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.597708 master-0 kubenswrapper[29252]: I1203 20:22:23.597415 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-csi-plugin-dir\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.597708 master-0 kubenswrapper[29252]: I1203 20:22:23.597466 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-file-lock-dir\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.597708 master-0 kubenswrapper[29252]: I1203 20:22:23.597557 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-sys\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.597911 master-0 kubenswrapper[29252]: I1203 20:22:23.597751 29252 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-csi-plugin-dir\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.597911 master-0 kubenswrapper[29252]: I1203 20:22:23.597891 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-node-plugin-dir\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.597999 master-0 kubenswrapper[29252]: I1203 20:22:23.597952 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-device-dir\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.598072 master-0 kubenswrapper[29252]: I1203 20:22:23.597999 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-pod-volumes-dir\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.598454 master-0 kubenswrapper[29252]: I1203 20:22:23.598376 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-registration-dir\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.598454 master-0 kubenswrapper[29252]: I1203 20:22:23.598423 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-udev\" (UniqueName: 
\"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-run-udev\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.598583 master-0 kubenswrapper[29252]: I1203 20:22:23.598524 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/ad601dfa-8310-41a9-8abd-119fbed1aa01-lvmd-config\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.602081 master-0 kubenswrapper[29252]: I1203 20:22:23.601040 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad601dfa-8310-41a9-8abd-119fbed1aa01-metrics-cert\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.625590 master-0 kubenswrapper[29252]: I1203 20:22:23.625540 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr8px\" (UniqueName: \"kubernetes.io/projected/ad601dfa-8310-41a9-8abd-119fbed1aa01-kube-api-access-dr8px\") pod \"vg-manager-k68ws\" (UID: \"ad601dfa-8310-41a9-8abd-119fbed1aa01\") " pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:23.738372 master-0 kubenswrapper[29252]: I1203 20:22:23.738245 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:24.216618 master-0 kubenswrapper[29252]: W1203 20:22:24.216522 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad601dfa_8310_41a9_8abd_119fbed1aa01.slice/crio-ed8718fb13964beeaa5f313a8bf7cbffefd33d08b9b19fb21088724445a251a4 WatchSource:0}: Error finding container ed8718fb13964beeaa5f313a8bf7cbffefd33d08b9b19fb21088724445a251a4: Status 404 returned error can't find the container with id ed8718fb13964beeaa5f313a8bf7cbffefd33d08b9b19fb21088724445a251a4 Dec 03 20:22:24.218235 master-0 kubenswrapper[29252]: I1203 20:22:24.218155 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-k68ws"] Dec 03 20:22:24.359799 master-0 kubenswrapper[29252]: I1203 20:22:24.359373 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-c7bbm" Dec 03 20:22:24.783175 master-0 kubenswrapper[29252]: I1203 20:22:24.783121 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-k68ws" event={"ID":"ad601dfa-8310-41a9-8abd-119fbed1aa01","Type":"ContainerStarted","Data":"e1ee433266c7f4e98b2238296bd055a52dd3d7642384d08805ef68d98a5eecf7"} Dec 03 20:22:24.783175 master-0 kubenswrapper[29252]: I1203 20:22:24.783176 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-k68ws" event={"ID":"ad601dfa-8310-41a9-8abd-119fbed1aa01","Type":"ContainerStarted","Data":"ed8718fb13964beeaa5f313a8bf7cbffefd33d08b9b19fb21088724445a251a4"} Dec 03 20:22:24.816165 master-0 kubenswrapper[29252]: I1203 20:22:24.816038 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/vg-manager-k68ws" podStartSLOduration=1.8160158050000001 podStartE2EDuration="1.816015805s" podCreationTimestamp="2025-12-03 20:22:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:22:24.809526276 +0000 UTC m=+779.623071239" watchObservedRunningTime="2025-12-03 20:22:24.816015805 +0000 UTC m=+779.629560768" Dec 03 20:22:26.803471 master-0 kubenswrapper[29252]: I1203 20:22:26.803403 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-k68ws_ad601dfa-8310-41a9-8abd-119fbed1aa01/vg-manager/0.log" Dec 03 20:22:26.804107 master-0 kubenswrapper[29252]: I1203 20:22:26.803487 29252 generic.go:334] "Generic (PLEG): container finished" podID="ad601dfa-8310-41a9-8abd-119fbed1aa01" containerID="e1ee433266c7f4e98b2238296bd055a52dd3d7642384d08805ef68d98a5eecf7" exitCode=1 Dec 03 20:22:26.804107 master-0 kubenswrapper[29252]: I1203 20:22:26.803529 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-k68ws" event={"ID":"ad601dfa-8310-41a9-8abd-119fbed1aa01","Type":"ContainerDied","Data":"e1ee433266c7f4e98b2238296bd055a52dd3d7642384d08805ef68d98a5eecf7"} Dec 03 20:22:26.804213 master-0 kubenswrapper[29252]: I1203 20:22:26.804195 29252 scope.go:117] "RemoveContainer" containerID="e1ee433266c7f4e98b2238296bd055a52dd3d7642384d08805ef68d98a5eecf7" Dec 03 20:22:27.168336 master-0 kubenswrapper[29252]: I1203 20:22:27.168255 29252 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock" Dec 03 20:22:27.370647 master-0 kubenswrapper[29252]: I1203 20:22:27.370457 29252 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock","Timestamp":"2025-12-03T20:22:27.168305865Z","Handler":null,"Name":""} Dec 03 20:22:27.373026 master-0 kubenswrapper[29252]: I1203 20:22:27.372980 29252 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: topolvm.io endpoint: 
/var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock versions: 1.0.0 Dec 03 20:22:27.373026 master-0 kubenswrapper[29252]: I1203 20:22:27.373028 29252 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: topolvm.io at endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock Dec 03 20:22:27.814088 master-0 kubenswrapper[29252]: I1203 20:22:27.814029 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-k68ws_ad601dfa-8310-41a9-8abd-119fbed1aa01/vg-manager/0.log" Dec 03 20:22:27.814614 master-0 kubenswrapper[29252]: I1203 20:22:27.814121 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-k68ws" event={"ID":"ad601dfa-8310-41a9-8abd-119fbed1aa01","Type":"ContainerStarted","Data":"cba62241aca44a0d64da451c150f255243c7b5071ebe0d51efd85c97f61912dc"} Dec 03 20:22:29.919172 master-0 kubenswrapper[29252]: I1203 20:22:29.919108 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-4sgv6"] Dec 03 20:22:29.920058 master-0 kubenswrapper[29252]: I1203 20:22:29.920034 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-4sgv6" Dec 03 20:22:29.922080 master-0 kubenswrapper[29252]: I1203 20:22:29.922031 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 03 20:22:29.922277 master-0 kubenswrapper[29252]: I1203 20:22:29.922251 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 03 20:22:29.934614 master-0 kubenswrapper[29252]: I1203 20:22:29.934393 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4sgv6"] Dec 03 20:22:29.976930 master-0 kubenswrapper[29252]: I1203 20:22:29.970799 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhcpd\" (UniqueName: \"kubernetes.io/projected/bf8c7606-3354-4927-931d-a0ca4721acd6-kube-api-access-xhcpd\") pod \"openstack-operator-index-4sgv6\" (UID: \"bf8c7606-3354-4927-931d-a0ca4721acd6\") " pod="openstack-operators/openstack-operator-index-4sgv6" Dec 03 20:22:30.072676 master-0 kubenswrapper[29252]: I1203 20:22:30.072608 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhcpd\" (UniqueName: \"kubernetes.io/projected/bf8c7606-3354-4927-931d-a0ca4721acd6-kube-api-access-xhcpd\") pod \"openstack-operator-index-4sgv6\" (UID: \"bf8c7606-3354-4927-931d-a0ca4721acd6\") " pod="openstack-operators/openstack-operator-index-4sgv6" Dec 03 20:22:30.088930 master-0 kubenswrapper[29252]: I1203 20:22:30.088860 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhcpd\" (UniqueName: \"kubernetes.io/projected/bf8c7606-3354-4927-931d-a0ca4721acd6-kube-api-access-xhcpd\") pod \"openstack-operator-index-4sgv6\" (UID: \"bf8c7606-3354-4927-931d-a0ca4721acd6\") " pod="openstack-operators/openstack-operator-index-4sgv6" Dec 03 20:22:30.243348 master-0 
kubenswrapper[29252]: I1203 20:22:30.243229 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-4sgv6" Dec 03 20:22:30.673258 master-0 kubenswrapper[29252]: I1203 20:22:30.673157 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4sgv6"] Dec 03 20:22:30.687195 master-0 kubenswrapper[29252]: W1203 20:22:30.687072 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf8c7606_3354_4927_931d_a0ca4721acd6.slice/crio-c688cc9fb5149de60b3b550a9af5f6e47445dfacdd907b93b01ae5b2cc75c9c8 WatchSource:0}: Error finding container c688cc9fb5149de60b3b550a9af5f6e47445dfacdd907b93b01ae5b2cc75c9c8: Status 404 returned error can't find the container with id c688cc9fb5149de60b3b550a9af5f6e47445dfacdd907b93b01ae5b2cc75c9c8 Dec 03 20:22:30.845808 master-0 kubenswrapper[29252]: I1203 20:22:30.845717 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4sgv6" event={"ID":"bf8c7606-3354-4927-931d-a0ca4721acd6","Type":"ContainerStarted","Data":"c688cc9fb5149de60b3b550a9af5f6e47445dfacdd907b93b01ae5b2cc75c9c8"} Dec 03 20:22:31.854638 master-0 kubenswrapper[29252]: I1203 20:22:31.854509 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4sgv6" event={"ID":"bf8c7606-3354-4927-931d-a0ca4721acd6","Type":"ContainerStarted","Data":"75d9760f0668b4056f61934e436ef847d3b05e1e3069c301028a227015cfb80f"} Dec 03 20:22:31.889246 master-0 kubenswrapper[29252]: I1203 20:22:31.889152 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-4sgv6" podStartSLOduration=2.01268807 podStartE2EDuration="2.889129324s" podCreationTimestamp="2025-12-03 20:22:29 +0000 UTC" firstStartedPulling="2025-12-03 20:22:30.690015923 +0000 UTC 
m=+785.503560876" lastFinishedPulling="2025-12-03 20:22:31.566457177 +0000 UTC m=+786.380002130" observedRunningTime="2025-12-03 20:22:31.878529555 +0000 UTC m=+786.692074508" watchObservedRunningTime="2025-12-03 20:22:31.889129324 +0000 UTC m=+786.702674287" Dec 03 20:22:33.739680 master-0 kubenswrapper[29252]: I1203 20:22:33.739613 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:33.741931 master-0 kubenswrapper[29252]: I1203 20:22:33.741879 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:33.782168 master-0 kubenswrapper[29252]: I1203 20:22:33.782084 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-5656567747-w9bgn" podUID="2329bde6-b226-4fca-864d-b152ccf49cf9" containerName="console" containerID="cri-o://2d07197a9cbe389114d58e42fc85f3a4ac1454d3224bee1077d7c411b3f2711d" gracePeriod=15 Dec 03 20:22:33.881262 master-0 kubenswrapper[29252]: I1203 20:22:33.881158 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:33.882756 master-0 kubenswrapper[29252]: I1203 20:22:33.882719 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/vg-manager-k68ws" Dec 03 20:22:34.314730 master-0 kubenswrapper[29252]: I1203 20:22:34.314647 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5656567747-w9bgn_2329bde6-b226-4fca-864d-b152ccf49cf9/console/0.log" Dec 03 20:22:34.314730 master-0 kubenswrapper[29252]: I1203 20:22:34.314734 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:22:34.389075 master-0 kubenswrapper[29252]: I1203 20:22:34.388998 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2329bde6-b226-4fca-864d-b152ccf49cf9-console-oauth-config\") pod \"2329bde6-b226-4fca-864d-b152ccf49cf9\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " Dec 03 20:22:34.389402 master-0 kubenswrapper[29252]: I1203 20:22:34.389133 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkdq4\" (UniqueName: \"kubernetes.io/projected/2329bde6-b226-4fca-864d-b152ccf49cf9-kube-api-access-dkdq4\") pod \"2329bde6-b226-4fca-864d-b152ccf49cf9\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " Dec 03 20:22:34.389402 master-0 kubenswrapper[29252]: I1203 20:22:34.389181 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-service-ca\") pod \"2329bde6-b226-4fca-864d-b152ccf49cf9\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " Dec 03 20:22:34.389402 master-0 kubenswrapper[29252]: I1203 20:22:34.389230 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-oauth-serving-cert\") pod \"2329bde6-b226-4fca-864d-b152ccf49cf9\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " Dec 03 20:22:34.389402 master-0 kubenswrapper[29252]: I1203 20:22:34.389319 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-console-config\") pod \"2329bde6-b226-4fca-864d-b152ccf49cf9\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " Dec 03 20:22:34.389576 master-0 kubenswrapper[29252]: I1203 
20:22:34.389424 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-trusted-ca-bundle\") pod \"2329bde6-b226-4fca-864d-b152ccf49cf9\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " Dec 03 20:22:34.389576 master-0 kubenswrapper[29252]: I1203 20:22:34.389513 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2329bde6-b226-4fca-864d-b152ccf49cf9-console-serving-cert\") pod \"2329bde6-b226-4fca-864d-b152ccf49cf9\" (UID: \"2329bde6-b226-4fca-864d-b152ccf49cf9\") " Dec 03 20:22:34.391577 master-0 kubenswrapper[29252]: I1203 20:22:34.391498 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2329bde6-b226-4fca-864d-b152ccf49cf9" (UID: "2329bde6-b226-4fca-864d-b152ccf49cf9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:22:34.391577 master-0 kubenswrapper[29252]: I1203 20:22:34.391540 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-console-config" (OuterVolumeSpecName: "console-config") pod "2329bde6-b226-4fca-864d-b152ccf49cf9" (UID: "2329bde6-b226-4fca-864d-b152ccf49cf9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:22:34.392104 master-0 kubenswrapper[29252]: I1203 20:22:34.392052 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2329bde6-b226-4fca-864d-b152ccf49cf9" (UID: "2329bde6-b226-4fca-864d-b152ccf49cf9"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:22:34.392344 master-0 kubenswrapper[29252]: I1203 20:22:34.392283 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-service-ca" (OuterVolumeSpecName: "service-ca") pod "2329bde6-b226-4fca-864d-b152ccf49cf9" (UID: "2329bde6-b226-4fca-864d-b152ccf49cf9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:22:34.393305 master-0 kubenswrapper[29252]: I1203 20:22:34.393244 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2329bde6-b226-4fca-864d-b152ccf49cf9-kube-api-access-dkdq4" (OuterVolumeSpecName: "kube-api-access-dkdq4") pod "2329bde6-b226-4fca-864d-b152ccf49cf9" (UID: "2329bde6-b226-4fca-864d-b152ccf49cf9"). InnerVolumeSpecName "kube-api-access-dkdq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:22:34.396403 master-0 kubenswrapper[29252]: I1203 20:22:34.396341 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2329bde6-b226-4fca-864d-b152ccf49cf9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2329bde6-b226-4fca-864d-b152ccf49cf9" (UID: "2329bde6-b226-4fca-864d-b152ccf49cf9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:22:34.397522 master-0 kubenswrapper[29252]: I1203 20:22:34.397448 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2329bde6-b226-4fca-864d-b152ccf49cf9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2329bde6-b226-4fca-864d-b152ccf49cf9" (UID: "2329bde6-b226-4fca-864d-b152ccf49cf9"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:22:34.491964 master-0 kubenswrapper[29252]: I1203 20:22:34.491881 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkdq4\" (UniqueName: \"kubernetes.io/projected/2329bde6-b226-4fca-864d-b152ccf49cf9-kube-api-access-dkdq4\") on node \"master-0\" DevicePath \"\"" Dec 03 20:22:34.491964 master-0 kubenswrapper[29252]: I1203 20:22:34.491926 29252 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 20:22:34.491964 master-0 kubenswrapper[29252]: I1203 20:22:34.491936 29252 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 20:22:34.491964 master-0 kubenswrapper[29252]: I1203 20:22:34.491945 29252 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-console-config\") on node \"master-0\" DevicePath \"\"" Dec 03 20:22:34.491964 master-0 kubenswrapper[29252]: I1203 20:22:34.491954 29252 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2329bde6-b226-4fca-864d-b152ccf49cf9-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 20:22:34.491964 master-0 kubenswrapper[29252]: I1203 20:22:34.491965 29252 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2329bde6-b226-4fca-864d-b152ccf49cf9-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 20:22:34.491964 master-0 kubenswrapper[29252]: I1203 20:22:34.491974 29252 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/2329bde6-b226-4fca-864d-b152ccf49cf9-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 03 20:22:34.933055 master-0 kubenswrapper[29252]: I1203 20:22:34.932961 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5656567747-w9bgn_2329bde6-b226-4fca-864d-b152ccf49cf9/console/0.log" Dec 03 20:22:34.933055 master-0 kubenswrapper[29252]: I1203 20:22:34.933049 29252 generic.go:334] "Generic (PLEG): container finished" podID="2329bde6-b226-4fca-864d-b152ccf49cf9" containerID="2d07197a9cbe389114d58e42fc85f3a4ac1454d3224bee1077d7c411b3f2711d" exitCode=2 Dec 03 20:22:34.934166 master-0 kubenswrapper[29252]: I1203 20:22:34.933158 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5656567747-w9bgn" Dec 03 20:22:34.934166 master-0 kubenswrapper[29252]: I1203 20:22:34.933223 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5656567747-w9bgn" event={"ID":"2329bde6-b226-4fca-864d-b152ccf49cf9","Type":"ContainerDied","Data":"2d07197a9cbe389114d58e42fc85f3a4ac1454d3224bee1077d7c411b3f2711d"} Dec 03 20:22:34.934166 master-0 kubenswrapper[29252]: I1203 20:22:34.933260 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5656567747-w9bgn" event={"ID":"2329bde6-b226-4fca-864d-b152ccf49cf9","Type":"ContainerDied","Data":"7950d43f53958f94be90826f9245aa2b01033789ccdd0dbeec841b3d3de5f8a9"} Dec 03 20:22:34.934166 master-0 kubenswrapper[29252]: I1203 20:22:34.933281 29252 scope.go:117] "RemoveContainer" containerID="2d07197a9cbe389114d58e42fc85f3a4ac1454d3224bee1077d7c411b3f2711d" Dec 03 20:22:34.955441 master-0 kubenswrapper[29252]: I1203 20:22:34.955404 29252 scope.go:117] "RemoveContainer" containerID="2d07197a9cbe389114d58e42fc85f3a4ac1454d3224bee1077d7c411b3f2711d" Dec 03 20:22:34.955885 master-0 kubenswrapper[29252]: E1203 20:22:34.955840 29252 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d07197a9cbe389114d58e42fc85f3a4ac1454d3224bee1077d7c411b3f2711d\": container with ID starting with 2d07197a9cbe389114d58e42fc85f3a4ac1454d3224bee1077d7c411b3f2711d not found: ID does not exist" containerID="2d07197a9cbe389114d58e42fc85f3a4ac1454d3224bee1077d7c411b3f2711d" Dec 03 20:22:34.955950 master-0 kubenswrapper[29252]: I1203 20:22:34.955894 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d07197a9cbe389114d58e42fc85f3a4ac1454d3224bee1077d7c411b3f2711d"} err="failed to get container status \"2d07197a9cbe389114d58e42fc85f3a4ac1454d3224bee1077d7c411b3f2711d\": rpc error: code = NotFound desc = could not find container \"2d07197a9cbe389114d58e42fc85f3a4ac1454d3224bee1077d7c411b3f2711d\": container with ID starting with 2d07197a9cbe389114d58e42fc85f3a4ac1454d3224bee1077d7c411b3f2711d not found: ID does not exist" Dec 03 20:22:35.001813 master-0 kubenswrapper[29252]: I1203 20:22:34.999264 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5656567747-w9bgn"] Dec 03 20:22:35.008685 master-0 kubenswrapper[29252]: I1203 20:22:35.008552 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5656567747-w9bgn"] Dec 03 20:22:35.445216 master-0 kubenswrapper[29252]: I1203 20:22:35.445120 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2329bde6-b226-4fca-864d-b152ccf49cf9" path="/var/lib/kubelet/pods/2329bde6-b226-4fca-864d-b152ccf49cf9/volumes" Dec 03 20:22:40.244446 master-0 kubenswrapper[29252]: I1203 20:22:40.244380 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-4sgv6" Dec 03 20:22:40.245051 master-0 kubenswrapper[29252]: I1203 20:22:40.244474 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-index-4sgv6" Dec 03 20:22:40.278715 master-0 kubenswrapper[29252]: I1203 20:22:40.278615 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-4sgv6" Dec 03 20:22:41.043565 master-0 kubenswrapper[29252]: I1203 20:22:41.043470 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-4sgv6" Dec 03 20:22:42.661801 master-0 kubenswrapper[29252]: I1203 20:22:42.657960 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5"] Dec 03 20:22:42.661801 master-0 kubenswrapper[29252]: E1203 20:22:42.658482 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2329bde6-b226-4fca-864d-b152ccf49cf9" containerName="console" Dec 03 20:22:42.661801 master-0 kubenswrapper[29252]: I1203 20:22:42.658502 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="2329bde6-b226-4fca-864d-b152ccf49cf9" containerName="console" Dec 03 20:22:42.661801 master-0 kubenswrapper[29252]: I1203 20:22:42.658813 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="2329bde6-b226-4fca-864d-b152ccf49cf9" containerName="console" Dec 03 20:22:42.661801 master-0 kubenswrapper[29252]: I1203 20:22:42.660688 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" Dec 03 20:22:42.670818 master-0 kubenswrapper[29252]: I1203 20:22:42.668045 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5"] Dec 03 20:22:42.765538 master-0 kubenswrapper[29252]: I1203 20:22:42.765476 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26c59077-74ee-4b0e-bda7-06c2b0a2cae4-bundle\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5\" (UID: \"26c59077-74ee-4b0e-bda7-06c2b0a2cae4\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" Dec 03 20:22:42.765749 master-0 kubenswrapper[29252]: I1203 20:22:42.765550 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwdkw\" (UniqueName: \"kubernetes.io/projected/26c59077-74ee-4b0e-bda7-06c2b0a2cae4-kube-api-access-wwdkw\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5\" (UID: \"26c59077-74ee-4b0e-bda7-06c2b0a2cae4\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" Dec 03 20:22:42.765749 master-0 kubenswrapper[29252]: I1203 20:22:42.765602 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26c59077-74ee-4b0e-bda7-06c2b0a2cae4-util\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5\" (UID: \"26c59077-74ee-4b0e-bda7-06c2b0a2cae4\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" Dec 03 20:22:42.867425 master-0 kubenswrapper[29252]: I1203 20:22:42.867362 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/26c59077-74ee-4b0e-bda7-06c2b0a2cae4-bundle\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5\" (UID: \"26c59077-74ee-4b0e-bda7-06c2b0a2cae4\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" Dec 03 20:22:42.867425 master-0 kubenswrapper[29252]: I1203 20:22:42.867428 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwdkw\" (UniqueName: \"kubernetes.io/projected/26c59077-74ee-4b0e-bda7-06c2b0a2cae4-kube-api-access-wwdkw\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5\" (UID: \"26c59077-74ee-4b0e-bda7-06c2b0a2cae4\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" Dec 03 20:22:42.867891 master-0 kubenswrapper[29252]: I1203 20:22:42.867481 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26c59077-74ee-4b0e-bda7-06c2b0a2cae4-util\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5\" (UID: \"26c59077-74ee-4b0e-bda7-06c2b0a2cae4\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" Dec 03 20:22:42.868206 master-0 kubenswrapper[29252]: I1203 20:22:42.868162 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26c59077-74ee-4b0e-bda7-06c2b0a2cae4-bundle\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5\" (UID: \"26c59077-74ee-4b0e-bda7-06c2b0a2cae4\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" Dec 03 20:22:42.868318 master-0 kubenswrapper[29252]: I1203 20:22:42.868242 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26c59077-74ee-4b0e-bda7-06c2b0a2cae4-util\") pod 
\"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5\" (UID: \"26c59077-74ee-4b0e-bda7-06c2b0a2cae4\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" Dec 03 20:22:42.887025 master-0 kubenswrapper[29252]: I1203 20:22:42.886976 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwdkw\" (UniqueName: \"kubernetes.io/projected/26c59077-74ee-4b0e-bda7-06c2b0a2cae4-kube-api-access-wwdkw\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5\" (UID: \"26c59077-74ee-4b0e-bda7-06c2b0a2cae4\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" Dec 03 20:22:42.996419 master-0 kubenswrapper[29252]: I1203 20:22:42.996296 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" Dec 03 20:22:43.448846 master-0 kubenswrapper[29252]: I1203 20:22:43.448787 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5"] Dec 03 20:22:43.463209 master-0 kubenswrapper[29252]: W1203 20:22:43.463159 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26c59077_74ee_4b0e_bda7_06c2b0a2cae4.slice/crio-312c3eb81e303b424814315ab8297db9066abdbc058d60b8cb169389348bdf4e WatchSource:0}: Error finding container 312c3eb81e303b424814315ab8297db9066abdbc058d60b8cb169389348bdf4e: Status 404 returned error can't find the container with id 312c3eb81e303b424814315ab8297db9066abdbc058d60b8cb169389348bdf4e Dec 03 20:22:44.041153 master-0 kubenswrapper[29252]: I1203 20:22:44.041103 29252 generic.go:334] "Generic (PLEG): container finished" podID="26c59077-74ee-4b0e-bda7-06c2b0a2cae4" containerID="29cf77eb859f39a65809811cc2f1c6d6dbda846aeaa161d290002d1f06a0fe27" exitCode=0 Dec 03 
20:22:44.042019 master-0 kubenswrapper[29252]: I1203 20:22:44.041252 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" event={"ID":"26c59077-74ee-4b0e-bda7-06c2b0a2cae4","Type":"ContainerDied","Data":"29cf77eb859f39a65809811cc2f1c6d6dbda846aeaa161d290002d1f06a0fe27"} Dec 03 20:22:44.042205 master-0 kubenswrapper[29252]: I1203 20:22:44.042177 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" event={"ID":"26c59077-74ee-4b0e-bda7-06c2b0a2cae4","Type":"ContainerStarted","Data":"312c3eb81e303b424814315ab8297db9066abdbc058d60b8cb169389348bdf4e"} Dec 03 20:22:45.055142 master-0 kubenswrapper[29252]: I1203 20:22:45.054980 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" event={"ID":"26c59077-74ee-4b0e-bda7-06c2b0a2cae4","Type":"ContainerStarted","Data":"5d6427a98935143aba0c5fc27054aed96c1dcb1021f2897885c30b92f4aa49cb"} Dec 03 20:22:46.069926 master-0 kubenswrapper[29252]: I1203 20:22:46.069836 29252 generic.go:334] "Generic (PLEG): container finished" podID="26c59077-74ee-4b0e-bda7-06c2b0a2cae4" containerID="5d6427a98935143aba0c5fc27054aed96c1dcb1021f2897885c30b92f4aa49cb" exitCode=0 Dec 03 20:22:46.069926 master-0 kubenswrapper[29252]: I1203 20:22:46.069886 29252 generic.go:334] "Generic (PLEG): container finished" podID="26c59077-74ee-4b0e-bda7-06c2b0a2cae4" containerID="0005fab00e1327fb83150d84280034218b2a106569a9ca5b227380c3499d2f84" exitCode=0 Dec 03 20:22:46.069926 master-0 kubenswrapper[29252]: I1203 20:22:46.069897 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" 
event={"ID":"26c59077-74ee-4b0e-bda7-06c2b0a2cae4","Type":"ContainerDied","Data":"5d6427a98935143aba0c5fc27054aed96c1dcb1021f2897885c30b92f4aa49cb"} Dec 03 20:22:46.070548 master-0 kubenswrapper[29252]: I1203 20:22:46.069996 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" event={"ID":"26c59077-74ee-4b0e-bda7-06c2b0a2cae4","Type":"ContainerDied","Data":"0005fab00e1327fb83150d84280034218b2a106569a9ca5b227380c3499d2f84"} Dec 03 20:22:47.494665 master-0 kubenswrapper[29252]: I1203 20:22:47.494616 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" Dec 03 20:22:47.563236 master-0 kubenswrapper[29252]: I1203 20:22:47.563106 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26c59077-74ee-4b0e-bda7-06c2b0a2cae4-bundle\") pod \"26c59077-74ee-4b0e-bda7-06c2b0a2cae4\" (UID: \"26c59077-74ee-4b0e-bda7-06c2b0a2cae4\") " Dec 03 20:22:47.563236 master-0 kubenswrapper[29252]: I1203 20:22:47.563239 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26c59077-74ee-4b0e-bda7-06c2b0a2cae4-util\") pod \"26c59077-74ee-4b0e-bda7-06c2b0a2cae4\" (UID: \"26c59077-74ee-4b0e-bda7-06c2b0a2cae4\") " Dec 03 20:22:47.563720 master-0 kubenswrapper[29252]: I1203 20:22:47.563297 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwdkw\" (UniqueName: \"kubernetes.io/projected/26c59077-74ee-4b0e-bda7-06c2b0a2cae4-kube-api-access-wwdkw\") pod \"26c59077-74ee-4b0e-bda7-06c2b0a2cae4\" (UID: \"26c59077-74ee-4b0e-bda7-06c2b0a2cae4\") " Dec 03 20:22:47.565122 master-0 kubenswrapper[29252]: I1203 20:22:47.565037 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/26c59077-74ee-4b0e-bda7-06c2b0a2cae4-bundle" (OuterVolumeSpecName: "bundle") pod "26c59077-74ee-4b0e-bda7-06c2b0a2cae4" (UID: "26c59077-74ee-4b0e-bda7-06c2b0a2cae4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:22:47.566667 master-0 kubenswrapper[29252]: I1203 20:22:47.566597 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26c59077-74ee-4b0e-bda7-06c2b0a2cae4-kube-api-access-wwdkw" (OuterVolumeSpecName: "kube-api-access-wwdkw") pod "26c59077-74ee-4b0e-bda7-06c2b0a2cae4" (UID: "26c59077-74ee-4b0e-bda7-06c2b0a2cae4"). InnerVolumeSpecName "kube-api-access-wwdkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:22:47.578560 master-0 kubenswrapper[29252]: I1203 20:22:47.578492 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26c59077-74ee-4b0e-bda7-06c2b0a2cae4-util" (OuterVolumeSpecName: "util") pod "26c59077-74ee-4b0e-bda7-06c2b0a2cae4" (UID: "26c59077-74ee-4b0e-bda7-06c2b0a2cae4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 20:22:47.666807 master-0 kubenswrapper[29252]: I1203 20:22:47.666658 29252 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26c59077-74ee-4b0e-bda7-06c2b0a2cae4-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 20:22:47.666807 master-0 kubenswrapper[29252]: I1203 20:22:47.666759 29252 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26c59077-74ee-4b0e-bda7-06c2b0a2cae4-util\") on node \"master-0\" DevicePath \"\"" Dec 03 20:22:47.666807 master-0 kubenswrapper[29252]: I1203 20:22:47.666799 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwdkw\" (UniqueName: \"kubernetes.io/projected/26c59077-74ee-4b0e-bda7-06c2b0a2cae4-kube-api-access-wwdkw\") on node \"master-0\" DevicePath \"\"" Dec 03 20:22:48.104893 master-0 kubenswrapper[29252]: I1203 20:22:48.104811 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" event={"ID":"26c59077-74ee-4b0e-bda7-06c2b0a2cae4","Type":"ContainerDied","Data":"312c3eb81e303b424814315ab8297db9066abdbc058d60b8cb169389348bdf4e"} Dec 03 20:22:48.104893 master-0 kubenswrapper[29252]: I1203 20:22:48.104876 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="312c3eb81e303b424814315ab8297db9066abdbc058d60b8cb169389348bdf4e" Dec 03 20:22:48.104893 master-0 kubenswrapper[29252]: I1203 20:22:48.104902 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5" Dec 03 20:22:55.313717 master-0 kubenswrapper[29252]: I1203 20:22:55.313637 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl"] Dec 03 20:22:55.314523 master-0 kubenswrapper[29252]: E1203 20:22:55.314043 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c59077-74ee-4b0e-bda7-06c2b0a2cae4" containerName="util" Dec 03 20:22:55.314523 master-0 kubenswrapper[29252]: I1203 20:22:55.314057 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c59077-74ee-4b0e-bda7-06c2b0a2cae4" containerName="util" Dec 03 20:22:55.314523 master-0 kubenswrapper[29252]: E1203 20:22:55.314088 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c59077-74ee-4b0e-bda7-06c2b0a2cae4" containerName="pull" Dec 03 20:22:55.314523 master-0 kubenswrapper[29252]: I1203 20:22:55.314096 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c59077-74ee-4b0e-bda7-06c2b0a2cae4" containerName="pull" Dec 03 20:22:55.314523 master-0 kubenswrapper[29252]: E1203 20:22:55.314132 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c59077-74ee-4b0e-bda7-06c2b0a2cae4" containerName="extract" Dec 03 20:22:55.314523 master-0 kubenswrapper[29252]: I1203 20:22:55.314139 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c59077-74ee-4b0e-bda7-06c2b0a2cae4" containerName="extract" Dec 03 20:22:55.314523 master-0 kubenswrapper[29252]: I1203 20:22:55.314297 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="26c59077-74ee-4b0e-bda7-06c2b0a2cae4" containerName="extract" Dec 03 20:22:55.314856 master-0 kubenswrapper[29252]: I1203 20:22:55.314848 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl" Dec 03 20:22:55.345693 master-0 kubenswrapper[29252]: I1203 20:22:55.345637 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl"] Dec 03 20:22:55.405826 master-0 kubenswrapper[29252]: I1203 20:22:55.405098 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8tfb\" (UniqueName: \"kubernetes.io/projected/9e7d177b-251a-48b9-815a-b69ce107d89c-kube-api-access-f8tfb\") pod \"openstack-operator-controller-operator-7dd5c7bb7c-dvqzl\" (UID: \"9e7d177b-251a-48b9-815a-b69ce107d89c\") " pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl" Dec 03 20:22:55.507401 master-0 kubenswrapper[29252]: I1203 20:22:55.507333 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8tfb\" (UniqueName: \"kubernetes.io/projected/9e7d177b-251a-48b9-815a-b69ce107d89c-kube-api-access-f8tfb\") pod \"openstack-operator-controller-operator-7dd5c7bb7c-dvqzl\" (UID: \"9e7d177b-251a-48b9-815a-b69ce107d89c\") " pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl" Dec 03 20:22:55.527319 master-0 kubenswrapper[29252]: I1203 20:22:55.527269 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8tfb\" (UniqueName: \"kubernetes.io/projected/9e7d177b-251a-48b9-815a-b69ce107d89c-kube-api-access-f8tfb\") pod \"openstack-operator-controller-operator-7dd5c7bb7c-dvqzl\" (UID: \"9e7d177b-251a-48b9-815a-b69ce107d89c\") " pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl" Dec 03 20:22:55.635011 master-0 kubenswrapper[29252]: I1203 20:22:55.634945 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl" Dec 03 20:22:56.127592 master-0 kubenswrapper[29252]: I1203 20:22:56.127519 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl"] Dec 03 20:22:56.129065 master-0 kubenswrapper[29252]: W1203 20:22:56.129010 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e7d177b_251a_48b9_815a_b69ce107d89c.slice/crio-fa6a636c50fab29be1db0ae6f3a6e1e714d3be509899844fcd8a587bdb7fac82 WatchSource:0}: Error finding container fa6a636c50fab29be1db0ae6f3a6e1e714d3be509899844fcd8a587bdb7fac82: Status 404 returned error can't find the container with id fa6a636c50fab29be1db0ae6f3a6e1e714d3be509899844fcd8a587bdb7fac82 Dec 03 20:22:56.189654 master-0 kubenswrapper[29252]: I1203 20:22:56.189591 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl" event={"ID":"9e7d177b-251a-48b9-815a-b69ce107d89c","Type":"ContainerStarted","Data":"fa6a636c50fab29be1db0ae6f3a6e1e714d3be509899844fcd8a587bdb7fac82"} Dec 03 20:23:01.252250 master-0 kubenswrapper[29252]: I1203 20:23:01.252154 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl" event={"ID":"9e7d177b-251a-48b9-815a-b69ce107d89c","Type":"ContainerStarted","Data":"d4a7a73b789ae1c3cc2e61a499786a7bbee6f76163a62254fbe6d44832335d92"} Dec 03 20:23:01.253302 master-0 kubenswrapper[29252]: I1203 20:23:01.252311 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl" Dec 03 20:23:01.292527 master-0 kubenswrapper[29252]: I1203 20:23:01.292364 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl" podStartSLOduration=2.012594364 podStartE2EDuration="6.292330136s" podCreationTimestamp="2025-12-03 20:22:55 +0000 UTC" firstStartedPulling="2025-12-03 20:22:56.131823064 +0000 UTC m=+810.945368027" lastFinishedPulling="2025-12-03 20:23:00.411558846 +0000 UTC m=+815.225103799" observedRunningTime="2025-12-03 20:23:01.283061189 +0000 UTC m=+816.096606152" watchObservedRunningTime="2025-12-03 20:23:01.292330136 +0000 UTC m=+816.105875149" Dec 03 20:23:05.639027 master-0 kubenswrapper[29252]: I1203 20:23:05.638938 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl" Dec 03 20:23:12.562088 master-0 kubenswrapper[29252]: I1203 20:23:12.561991 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7b84d49558-q6t9g"] Dec 03 20:23:12.563610 master-0 kubenswrapper[29252]: I1203 20:23:12.563570 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-q6t9g" Dec 03 20:23:12.588734 master-0 kubenswrapper[29252]: I1203 20:23:12.588660 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7b84d49558-q6t9g"] Dec 03 20:23:12.628838 master-0 kubenswrapper[29252]: I1203 20:23:12.627051 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx4mh\" (UniqueName: \"kubernetes.io/projected/385e1bf4-f4ce-4589-92ad-124932b9c490-kube-api-access-cx4mh\") pod \"openstack-operator-controller-operator-7b84d49558-q6t9g\" (UID: \"385e1bf4-f4ce-4589-92ad-124932b9c490\") " pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-q6t9g" Dec 03 20:23:12.728984 master-0 kubenswrapper[29252]: I1203 20:23:12.728912 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx4mh\" (UniqueName: \"kubernetes.io/projected/385e1bf4-f4ce-4589-92ad-124932b9c490-kube-api-access-cx4mh\") pod \"openstack-operator-controller-operator-7b84d49558-q6t9g\" (UID: \"385e1bf4-f4ce-4589-92ad-124932b9c490\") " pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-q6t9g" Dec 03 20:23:12.749052 master-0 kubenswrapper[29252]: I1203 20:23:12.748548 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx4mh\" (UniqueName: \"kubernetes.io/projected/385e1bf4-f4ce-4589-92ad-124932b9c490-kube-api-access-cx4mh\") pod \"openstack-operator-controller-operator-7b84d49558-q6t9g\" (UID: \"385e1bf4-f4ce-4589-92ad-124932b9c490\") " pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-q6t9g" Dec 03 20:23:12.881949 master-0 kubenswrapper[29252]: I1203 20:23:12.881867 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-q6t9g" Dec 03 20:23:13.502937 master-0 kubenswrapper[29252]: I1203 20:23:13.502824 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7b84d49558-q6t9g"] Dec 03 20:23:14.390037 master-0 kubenswrapper[29252]: I1203 20:23:14.389916 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-q6t9g" event={"ID":"385e1bf4-f4ce-4589-92ad-124932b9c490","Type":"ContainerStarted","Data":"0cadb6b73582535b8662e506d231175d2cea335067aa0b3f951dc5fd11b980ae"} Dec 03 20:23:14.390037 master-0 kubenswrapper[29252]: I1203 20:23:14.389977 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-q6t9g" event={"ID":"385e1bf4-f4ce-4589-92ad-124932b9c490","Type":"ContainerStarted","Data":"58e7544ef3e19bce06e62aabb45a1b1dcc966ef0e1e39db4c8802d3e201d4048"} Dec 03 20:23:14.391361 master-0 kubenswrapper[29252]: I1203 20:23:14.391312 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-q6t9g" Dec 03 20:23:14.429396 master-0 kubenswrapper[29252]: I1203 20:23:14.429308 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-q6t9g" podStartSLOduration=2.429281205 podStartE2EDuration="2.429281205s" podCreationTimestamp="2025-12-03 20:23:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:23:14.424812746 +0000 UTC m=+829.238357749" watchObservedRunningTime="2025-12-03 20:23:14.429281205 +0000 UTC m=+829.242826168" Dec 03 20:23:22.885718 master-0 kubenswrapper[29252]: I1203 20:23:22.885666 29252 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-q6t9g" Dec 03 20:23:22.984421 master-0 kubenswrapper[29252]: I1203 20:23:22.984367 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl"] Dec 03 20:23:22.985000 master-0 kubenswrapper[29252]: I1203 20:23:22.984968 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl" podUID="9e7d177b-251a-48b9-815a-b69ce107d89c" containerName="operator" containerID="cri-o://d4a7a73b789ae1c3cc2e61a499786a7bbee6f76163a62254fbe6d44832335d92" gracePeriod=10 Dec 03 20:23:23.428249 master-0 kubenswrapper[29252]: I1203 20:23:23.428183 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl" Dec 03 20:23:23.474223 master-0 kubenswrapper[29252]: I1203 20:23:23.474075 29252 generic.go:334] "Generic (PLEG): container finished" podID="9e7d177b-251a-48b9-815a-b69ce107d89c" containerID="d4a7a73b789ae1c3cc2e61a499786a7bbee6f76163a62254fbe6d44832335d92" exitCode=0 Dec 03 20:23:23.474223 master-0 kubenswrapper[29252]: I1203 20:23:23.474181 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl" Dec 03 20:23:23.474461 master-0 kubenswrapper[29252]: I1203 20:23:23.474178 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl" event={"ID":"9e7d177b-251a-48b9-815a-b69ce107d89c","Type":"ContainerDied","Data":"d4a7a73b789ae1c3cc2e61a499786a7bbee6f76163a62254fbe6d44832335d92"} Dec 03 20:23:23.474461 master-0 kubenswrapper[29252]: I1203 20:23:23.474322 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl" event={"ID":"9e7d177b-251a-48b9-815a-b69ce107d89c","Type":"ContainerDied","Data":"fa6a636c50fab29be1db0ae6f3a6e1e714d3be509899844fcd8a587bdb7fac82"} Dec 03 20:23:23.474461 master-0 kubenswrapper[29252]: I1203 20:23:23.474345 29252 scope.go:117] "RemoveContainer" containerID="d4a7a73b789ae1c3cc2e61a499786a7bbee6f76163a62254fbe6d44832335d92" Dec 03 20:23:23.501760 master-0 kubenswrapper[29252]: I1203 20:23:23.501703 29252 scope.go:117] "RemoveContainer" containerID="d4a7a73b789ae1c3cc2e61a499786a7bbee6f76163a62254fbe6d44832335d92" Dec 03 20:23:23.503719 master-0 kubenswrapper[29252]: E1203 20:23:23.503618 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4a7a73b789ae1c3cc2e61a499786a7bbee6f76163a62254fbe6d44832335d92\": container with ID starting with d4a7a73b789ae1c3cc2e61a499786a7bbee6f76163a62254fbe6d44832335d92 not found: ID does not exist" containerID="d4a7a73b789ae1c3cc2e61a499786a7bbee6f76163a62254fbe6d44832335d92" Dec 03 20:23:23.503799 master-0 kubenswrapper[29252]: I1203 20:23:23.503734 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4a7a73b789ae1c3cc2e61a499786a7bbee6f76163a62254fbe6d44832335d92"} err="failed to get container status 
\"d4a7a73b789ae1c3cc2e61a499786a7bbee6f76163a62254fbe6d44832335d92\": rpc error: code = NotFound desc = could not find container \"d4a7a73b789ae1c3cc2e61a499786a7bbee6f76163a62254fbe6d44832335d92\": container with ID starting with d4a7a73b789ae1c3cc2e61a499786a7bbee6f76163a62254fbe6d44832335d92 not found: ID does not exist" Dec 03 20:23:23.545087 master-0 kubenswrapper[29252]: I1203 20:23:23.545021 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8tfb\" (UniqueName: \"kubernetes.io/projected/9e7d177b-251a-48b9-815a-b69ce107d89c-kube-api-access-f8tfb\") pod \"9e7d177b-251a-48b9-815a-b69ce107d89c\" (UID: \"9e7d177b-251a-48b9-815a-b69ce107d89c\") " Dec 03 20:23:23.548304 master-0 kubenswrapper[29252]: I1203 20:23:23.548248 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e7d177b-251a-48b9-815a-b69ce107d89c-kube-api-access-f8tfb" (OuterVolumeSpecName: "kube-api-access-f8tfb") pod "9e7d177b-251a-48b9-815a-b69ce107d89c" (UID: "9e7d177b-251a-48b9-815a-b69ce107d89c"). InnerVolumeSpecName "kube-api-access-f8tfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:23:23.646842 master-0 kubenswrapper[29252]: I1203 20:23:23.646700 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8tfb\" (UniqueName: \"kubernetes.io/projected/9e7d177b-251a-48b9-815a-b69ce107d89c-kube-api-access-f8tfb\") on node \"master-0\" DevicePath \"\"" Dec 03 20:23:23.815698 master-0 kubenswrapper[29252]: I1203 20:23:23.815638 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl"] Dec 03 20:23:23.830733 master-0 kubenswrapper[29252]: I1203 20:23:23.830669 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-dvqzl"] Dec 03 20:23:25.430178 master-0 kubenswrapper[29252]: I1203 20:23:25.430123 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e7d177b-251a-48b9-815a-b69ce107d89c" path="/var/lib/kubelet/pods/9e7d177b-251a-48b9-815a-b69ce107d89c/volumes" Dec 03 20:24:18.658719 master-0 kubenswrapper[29252]: I1203 20:24:18.658674 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cd89994b5-78ft8"] Dec 03 20:24:18.659886 master-0 kubenswrapper[29252]: E1203 20:24:18.659866 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e7d177b-251a-48b9-815a-b69ce107d89c" containerName="operator" Dec 03 20:24:18.659996 master-0 kubenswrapper[29252]: I1203 20:24:18.659982 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e7d177b-251a-48b9-815a-b69ce107d89c" containerName="operator" Dec 03 20:24:18.666765 master-0 kubenswrapper[29252]: I1203 20:24:18.666719 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e7d177b-251a-48b9-815a-b69ce107d89c" containerName="operator" Dec 03 20:24:18.674575 master-0 kubenswrapper[29252]: I1203 20:24:18.674233 29252 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/cinder-operator-controller-manager-f8856dd79-fsg7r"] Dec 03 20:24:18.678794 master-0 kubenswrapper[29252]: I1203 20:24:18.675742 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-78ft8" Dec 03 20:24:18.678794 master-0 kubenswrapper[29252]: I1203 20:24:18.677756 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-fsg7r" Dec 03 20:24:18.678794 master-0 kubenswrapper[29252]: I1203 20:24:18.678633 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84bc9f68f5-pnq2w"] Dec 03 20:24:18.682004 master-0 kubenswrapper[29252]: I1203 20:24:18.680293 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-pnq2w" Dec 03 20:24:18.689164 master-0 kubenswrapper[29252]: I1203 20:24:18.689017 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-f8856dd79-fsg7r"] Dec 03 20:24:18.693174 master-0 kubenswrapper[29252]: I1203 20:24:18.693138 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cd89994b5-78ft8"] Dec 03 20:24:18.706757 master-0 kubenswrapper[29252]: I1203 20:24:18.701383 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9sfg\" (UniqueName: \"kubernetes.io/projected/4bfaaa2e-15b3-40fb-93c2-994c4a38559d-kube-api-access-z9sfg\") pod \"cinder-operator-controller-manager-f8856dd79-fsg7r\" (UID: \"4bfaaa2e-15b3-40fb-93c2-994c4a38559d\") " pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-fsg7r" Dec 03 20:24:18.706757 master-0 kubenswrapper[29252]: I1203 20:24:18.701831 29252 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzbs4\" (UniqueName: \"kubernetes.io/projected/b6ca362c-809a-47d1-8b68-9848967d382a-kube-api-access-vzbs4\") pod \"barbican-operator-controller-manager-5cd89994b5-78ft8\" (UID: \"b6ca362c-809a-47d1-8b68-9848967d382a\") " pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-78ft8" Dec 03 20:24:18.706757 master-0 kubenswrapper[29252]: I1203 20:24:18.701901 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7db6b\" (UniqueName: \"kubernetes.io/projected/615341bc-bf59-4b24-9baa-3223edd30ad0-kube-api-access-7db6b\") pod \"designate-operator-controller-manager-84bc9f68f5-pnq2w\" (UID: \"615341bc-bf59-4b24-9baa-3223edd30ad0\") " pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-pnq2w" Dec 03 20:24:18.725568 master-0 kubenswrapper[29252]: I1203 20:24:18.725511 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-78cd4f7769-7lgz9"] Dec 03 20:24:18.732765 master-0 kubenswrapper[29252]: I1203 20:24:18.730098 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-7lgz9" Dec 03 20:24:18.757940 master-0 kubenswrapper[29252]: I1203 20:24:18.757887 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84bc9f68f5-pnq2w"] Dec 03 20:24:18.786955 master-0 kubenswrapper[29252]: I1203 20:24:18.786905 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-7fd96594c7-bhdr8"] Dec 03 20:24:18.788461 master-0 kubenswrapper[29252]: I1203 20:24:18.788432 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bhdr8" Dec 03 20:24:18.803160 master-0 kubenswrapper[29252]: I1203 20:24:18.800843 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-78cd4f7769-7lgz9"] Dec 03 20:24:18.803160 master-0 kubenswrapper[29252]: I1203 20:24:18.803013 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4rmk\" (UniqueName: \"kubernetes.io/projected/c2fed802-28a0-40d3-b422-581c334d8bc5-kube-api-access-j4rmk\") pod \"glance-operator-controller-manager-78cd4f7769-7lgz9\" (UID: \"c2fed802-28a0-40d3-b422-581c334d8bc5\") " pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-7lgz9" Dec 03 20:24:18.803394 master-0 kubenswrapper[29252]: I1203 20:24:18.803190 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzbs4\" (UniqueName: \"kubernetes.io/projected/b6ca362c-809a-47d1-8b68-9848967d382a-kube-api-access-vzbs4\") pod \"barbican-operator-controller-manager-5cd89994b5-78ft8\" (UID: \"b6ca362c-809a-47d1-8b68-9848967d382a\") " pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-78ft8" Dec 03 20:24:18.803973 master-0 kubenswrapper[29252]: I1203 20:24:18.803446 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7db6b\" (UniqueName: \"kubernetes.io/projected/615341bc-bf59-4b24-9baa-3223edd30ad0-kube-api-access-7db6b\") pod \"designate-operator-controller-manager-84bc9f68f5-pnq2w\" (UID: \"615341bc-bf59-4b24-9baa-3223edd30ad0\") " pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-pnq2w" Dec 03 20:24:18.803973 master-0 kubenswrapper[29252]: I1203 20:24:18.803570 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9sfg\" (UniqueName: 
\"kubernetes.io/projected/4bfaaa2e-15b3-40fb-93c2-994c4a38559d-kube-api-access-z9sfg\") pod \"cinder-operator-controller-manager-f8856dd79-fsg7r\" (UID: \"4bfaaa2e-15b3-40fb-93c2-994c4a38559d\") " pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-fsg7r" Dec 03 20:24:18.803973 master-0 kubenswrapper[29252]: I1203 20:24:18.803652 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7gqq\" (UniqueName: \"kubernetes.io/projected/7f965ddc-ff13-4f8e-b20c-aad918a7be33-kube-api-access-q7gqq\") pod \"heat-operator-controller-manager-7fd96594c7-bhdr8\" (UID: \"7f965ddc-ff13-4f8e-b20c-aad918a7be33\") " pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bhdr8" Dec 03 20:24:18.811426 master-0 kubenswrapper[29252]: I1203 20:24:18.811357 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-7fd96594c7-bhdr8"] Dec 03 20:24:18.823715 master-0 kubenswrapper[29252]: I1203 20:24:18.823305 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzbs4\" (UniqueName: \"kubernetes.io/projected/b6ca362c-809a-47d1-8b68-9848967d382a-kube-api-access-vzbs4\") pod \"barbican-operator-controller-manager-5cd89994b5-78ft8\" (UID: \"b6ca362c-809a-47d1-8b68-9848967d382a\") " pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-78ft8" Dec 03 20:24:18.826423 master-0 kubenswrapper[29252]: I1203 20:24:18.824427 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-f6cc97788-v6spm"] Dec 03 20:24:18.826423 master-0 kubenswrapper[29252]: I1203 20:24:18.825756 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-v6spm" Dec 03 20:24:18.840639 master-0 kubenswrapper[29252]: I1203 20:24:18.840591 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9sfg\" (UniqueName: \"kubernetes.io/projected/4bfaaa2e-15b3-40fb-93c2-994c4a38559d-kube-api-access-z9sfg\") pod \"cinder-operator-controller-manager-f8856dd79-fsg7r\" (UID: \"4bfaaa2e-15b3-40fb-93c2-994c4a38559d\") " pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-fsg7r" Dec 03 20:24:18.840877 master-0 kubenswrapper[29252]: I1203 20:24:18.840670 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-f6cc97788-v6spm"] Dec 03 20:24:18.878467 master-0 kubenswrapper[29252]: I1203 20:24:18.878382 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7db6b\" (UniqueName: \"kubernetes.io/projected/615341bc-bf59-4b24-9baa-3223edd30ad0-kube-api-access-7db6b\") pod \"designate-operator-controller-manager-84bc9f68f5-pnq2w\" (UID: \"615341bc-bf59-4b24-9baa-3223edd30ad0\") " pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-pnq2w" Dec 03 20:24:18.886569 master-0 kubenswrapper[29252]: I1203 20:24:18.886516 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt"] Dec 03 20:24:18.888231 master-0 kubenswrapper[29252]: I1203 20:24:18.888196 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt" Dec 03 20:24:18.896174 master-0 kubenswrapper[29252]: I1203 20:24:18.895619 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 03 20:24:18.917950 master-0 kubenswrapper[29252]: I1203 20:24:18.910808 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7gqq\" (UniqueName: \"kubernetes.io/projected/7f965ddc-ff13-4f8e-b20c-aad918a7be33-kube-api-access-q7gqq\") pod \"heat-operator-controller-manager-7fd96594c7-bhdr8\" (UID: \"7f965ddc-ff13-4f8e-b20c-aad918a7be33\") " pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bhdr8" Dec 03 20:24:18.917950 master-0 kubenswrapper[29252]: I1203 20:24:18.910873 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4rmk\" (UniqueName: \"kubernetes.io/projected/c2fed802-28a0-40d3-b422-581c334d8bc5-kube-api-access-j4rmk\") pod \"glance-operator-controller-manager-78cd4f7769-7lgz9\" (UID: \"c2fed802-28a0-40d3-b422-581c334d8bc5\") " pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-7lgz9" Dec 03 20:24:18.942365 master-0 kubenswrapper[29252]: I1203 20:24:18.940840 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt"] Dec 03 20:24:18.967869 master-0 kubenswrapper[29252]: I1203 20:24:18.960428 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4rmk\" (UniqueName: \"kubernetes.io/projected/c2fed802-28a0-40d3-b422-581c334d8bc5-kube-api-access-j4rmk\") pod \"glance-operator-controller-manager-78cd4f7769-7lgz9\" (UID: \"c2fed802-28a0-40d3-b422-581c334d8bc5\") " pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-7lgz9" Dec 03 20:24:18.983607 master-0 kubenswrapper[29252]: I1203 20:24:18.980454 
29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7gqq\" (UniqueName: \"kubernetes.io/projected/7f965ddc-ff13-4f8e-b20c-aad918a7be33-kube-api-access-q7gqq\") pod \"heat-operator-controller-manager-7fd96594c7-bhdr8\" (UID: \"7f965ddc-ff13-4f8e-b20c-aad918a7be33\") " pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bhdr8" Dec 03 20:24:19.017801 master-0 kubenswrapper[29252]: I1203 20:24:19.009031 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7c9bfd6967-kgmrh"] Dec 03 20:24:19.017801 master-0 kubenswrapper[29252]: I1203 20:24:19.010745 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-kgmrh" Dec 03 20:24:19.017801 master-0 kubenswrapper[29252]: I1203 20:24:19.011451 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-78ft8" Dec 03 20:24:19.017801 master-0 kubenswrapper[29252]: I1203 20:24:19.013306 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-jcxjt\" (UID: \"e0c29a23-11dd-445c-8ebf-cef7994d7bc3\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt" Dec 03 20:24:19.017801 master-0 kubenswrapper[29252]: I1203 20:24:19.013433 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjzw9\" (UniqueName: \"kubernetes.io/projected/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-kube-api-access-qjzw9\") pod \"infra-operator-controller-manager-7d9c9d7fd8-jcxjt\" (UID: \"e0c29a23-11dd-445c-8ebf-cef7994d7bc3\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt" 
Dec 03 20:24:19.017801 master-0 kubenswrapper[29252]: I1203 20:24:19.013465 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqpqb\" (UniqueName: \"kubernetes.io/projected/4cf77700-7d9a-4d7e-bf0a-71777fa32e55-kube-api-access-tqpqb\") pod \"horizon-operator-controller-manager-f6cc97788-v6spm\" (UID: \"4cf77700-7d9a-4d7e-bf0a-71777fa32e55\") " pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-v6spm" Dec 03 20:24:19.017801 master-0 kubenswrapper[29252]: I1203 20:24:19.013516 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkdvx\" (UniqueName: \"kubernetes.io/projected/7d7bb0ae-4a5d-4196-a340-51fca6907f3a-kube-api-access-mkdvx\") pod \"ironic-operator-controller-manager-7c9bfd6967-kgmrh\" (UID: \"7d7bb0ae-4a5d-4196-a340-51fca6907f3a\") " pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-kgmrh" Dec 03 20:24:19.018206 master-0 kubenswrapper[29252]: I1203 20:24:19.017992 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-t5dvt"] Dec 03 20:24:19.025984 master-0 kubenswrapper[29252]: I1203 20:24:19.021695 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-t5dvt" Dec 03 20:24:19.055035 master-0 kubenswrapper[29252]: I1203 20:24:19.053673 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7c9bfd6967-kgmrh"] Dec 03 20:24:19.055035 master-0 kubenswrapper[29252]: I1203 20:24:19.054173 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-fsg7r" Dec 03 20:24:19.075897 master-0 kubenswrapper[29252]: I1203 20:24:19.058977 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-pnq2w" Dec 03 20:24:19.091347 master-0 kubenswrapper[29252]: I1203 20:24:19.090720 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-7lgz9" Dec 03 20:24:19.145885 master-0 kubenswrapper[29252]: I1203 20:24:19.122145 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjzw9\" (UniqueName: \"kubernetes.io/projected/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-kube-api-access-qjzw9\") pod \"infra-operator-controller-manager-7d9c9d7fd8-jcxjt\" (UID: \"e0c29a23-11dd-445c-8ebf-cef7994d7bc3\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt" Dec 03 20:24:19.145885 master-0 kubenswrapper[29252]: I1203 20:24:19.122207 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqpqb\" (UniqueName: \"kubernetes.io/projected/4cf77700-7d9a-4d7e-bf0a-71777fa32e55-kube-api-access-tqpqb\") pod \"horizon-operator-controller-manager-f6cc97788-v6spm\" (UID: \"4cf77700-7d9a-4d7e-bf0a-71777fa32e55\") " pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-v6spm" Dec 03 20:24:19.145885 master-0 kubenswrapper[29252]: I1203 20:24:19.122257 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkdvx\" (UniqueName: \"kubernetes.io/projected/7d7bb0ae-4a5d-4196-a340-51fca6907f3a-kube-api-access-mkdvx\") pod \"ironic-operator-controller-manager-7c9bfd6967-kgmrh\" (UID: \"7d7bb0ae-4a5d-4196-a340-51fca6907f3a\") " pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-kgmrh" Dec 03 20:24:19.145885 master-0 kubenswrapper[29252]: I1203 20:24:19.124887 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bhdr8" Dec 03 20:24:19.145885 master-0 kubenswrapper[29252]: I1203 20:24:19.140923 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-jcxjt\" (UID: \"e0c29a23-11dd-445c-8ebf-cef7994d7bc3\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt" Dec 03 20:24:19.172854 master-0 kubenswrapper[29252]: E1203 20:24:19.170913 29252 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 20:24:19.172854 master-0 kubenswrapper[29252]: E1203 20:24:19.171020 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert podName:e0c29a23-11dd-445c-8ebf-cef7994d7bc3 nodeName:}" failed. No retries permitted until 2025-12-03 20:24:19.670998545 +0000 UTC m=+894.484543498 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-jcxjt" (UID: "e0c29a23-11dd-445c-8ebf-cef7994d7bc3") : secret "infra-operator-webhook-server-cert" not found Dec 03 20:24:19.199946 master-0 kubenswrapper[29252]: I1203 20:24:19.190418 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqpqb\" (UniqueName: \"kubernetes.io/projected/4cf77700-7d9a-4d7e-bf0a-71777fa32e55-kube-api-access-tqpqb\") pod \"horizon-operator-controller-manager-f6cc97788-v6spm\" (UID: \"4cf77700-7d9a-4d7e-bf0a-71777fa32e55\") " pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-v6spm" Dec 03 20:24:19.199946 master-0 kubenswrapper[29252]: I1203 20:24:19.190822 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-t5dvt"] Dec 03 20:24:19.211493 master-0 kubenswrapper[29252]: I1203 20:24:19.202089 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjzw9\" (UniqueName: \"kubernetes.io/projected/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-kube-api-access-qjzw9\") pod \"infra-operator-controller-manager-7d9c9d7fd8-jcxjt\" (UID: \"e0c29a23-11dd-445c-8ebf-cef7994d7bc3\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt" Dec 03 20:24:19.236559 master-0 kubenswrapper[29252]: I1203 20:24:19.233105 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-56f9fbf74b-q66vk"] Dec 03 20:24:19.236559 master-0 kubenswrapper[29252]: I1203 20:24:19.235823 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-q66vk" Dec 03 20:24:19.253451 master-0 kubenswrapper[29252]: I1203 20:24:19.245230 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkdvx\" (UniqueName: \"kubernetes.io/projected/7d7bb0ae-4a5d-4196-a340-51fca6907f3a-kube-api-access-mkdvx\") pod \"ironic-operator-controller-manager-7c9bfd6967-kgmrh\" (UID: \"7d7bb0ae-4a5d-4196-a340-51fca6907f3a\") " pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-kgmrh" Dec 03 20:24:19.293896 master-0 kubenswrapper[29252]: I1203 20:24:19.272369 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-647d75769b-qw258"] Dec 03 20:24:19.293896 master-0 kubenswrapper[29252]: I1203 20:24:19.274807 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mmg9\" (UniqueName: \"kubernetes.io/projected/28196001-edc5-4152-830f-7712255d742c-kube-api-access-7mmg9\") pod \"keystone-operator-controller-manager-58b8dcc5fb-t5dvt\" (UID: \"28196001-edc5-4152-830f-7712255d742c\") " pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-t5dvt" Dec 03 20:24:19.293896 master-0 kubenswrapper[29252]: I1203 20:24:19.274853 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwwz4\" (UniqueName: \"kubernetes.io/projected/64a91869-6d5d-4f4a-8f51-9eab613c4b13-kube-api-access-lwwz4\") pod \"manila-operator-controller-manager-56f9fbf74b-q66vk\" (UID: \"64a91869-6d5d-4f4a-8f51-9eab613c4b13\") " pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-q66vk" Dec 03 20:24:19.293896 master-0 kubenswrapper[29252]: I1203 20:24:19.276404 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-qw258" Dec 03 20:24:19.318647 master-0 kubenswrapper[29252]: I1203 20:24:19.317261 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-56f9fbf74b-q66vk"] Dec 03 20:24:19.318647 master-0 kubenswrapper[29252]: I1203 20:24:19.317397 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-kgmrh" Dec 03 20:24:19.339673 master-0 kubenswrapper[29252]: I1203 20:24:19.339624 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-647d75769b-qw258"] Dec 03 20:24:19.367499 master-0 kubenswrapper[29252]: I1203 20:24:19.367460 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-5fcb6"] Dec 03 20:24:19.369016 master-0 kubenswrapper[29252]: I1203 20:24:19.368992 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-5fcb6" Dec 03 20:24:19.381556 master-0 kubenswrapper[29252]: I1203 20:24:19.381510 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-5fcb6"] Dec 03 20:24:19.382271 master-0 kubenswrapper[29252]: I1203 20:24:19.382216 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mmg9\" (UniqueName: \"kubernetes.io/projected/28196001-edc5-4152-830f-7712255d742c-kube-api-access-7mmg9\") pod \"keystone-operator-controller-manager-58b8dcc5fb-t5dvt\" (UID: \"28196001-edc5-4152-830f-7712255d742c\") " pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-t5dvt" Dec 03 20:24:19.382335 master-0 kubenswrapper[29252]: I1203 20:24:19.382313 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwwz4\" (UniqueName: \"kubernetes.io/projected/64a91869-6d5d-4f4a-8f51-9eab613c4b13-kube-api-access-lwwz4\") pod \"manila-operator-controller-manager-56f9fbf74b-q66vk\" (UID: \"64a91869-6d5d-4f4a-8f51-9eab613c4b13\") " pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-q66vk" Dec 03 20:24:19.382449 master-0 kubenswrapper[29252]: I1203 20:24:19.382412 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqm5n\" (UniqueName: \"kubernetes.io/projected/12773d43-060c-4f0c-8a6c-615a6f577894-kube-api-access-bqm5n\") pod \"mariadb-operator-controller-manager-647d75769b-qw258\" (UID: \"12773d43-060c-4f0c-8a6c-615a6f577894\") " pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-qw258" Dec 03 20:24:19.398044 master-0 kubenswrapper[29252]: I1203 20:24:19.397997 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-865fc86d5b-twshg"] Dec 03 20:24:19.399726 master-0 
kubenswrapper[29252]: I1203 20:24:19.399695 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-twshg" Dec 03 20:24:19.410089 master-0 kubenswrapper[29252]: I1203 20:24:19.410042 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-845b79dc4f-kp2nj"] Dec 03 20:24:19.411440 master-0 kubenswrapper[29252]: I1203 20:24:19.411408 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-kp2nj" Dec 03 20:24:19.430913 master-0 kubenswrapper[29252]: I1203 20:24:19.424502 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-v6spm" Dec 03 20:24:19.437555 master-0 kubenswrapper[29252]: I1203 20:24:19.436185 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mmg9\" (UniqueName: \"kubernetes.io/projected/28196001-edc5-4152-830f-7712255d742c-kube-api-access-7mmg9\") pod \"keystone-operator-controller-manager-58b8dcc5fb-t5dvt\" (UID: \"28196001-edc5-4152-830f-7712255d742c\") " pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-t5dvt" Dec 03 20:24:19.448765 master-0 kubenswrapper[29252]: I1203 20:24:19.448723 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwwz4\" (UniqueName: \"kubernetes.io/projected/64a91869-6d5d-4f4a-8f51-9eab613c4b13-kube-api-access-lwwz4\") pod \"manila-operator-controller-manager-56f9fbf74b-q66vk\" (UID: \"64a91869-6d5d-4f4a-8f51-9eab613c4b13\") " pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-q66vk" Dec 03 20:24:19.492331 master-0 kubenswrapper[29252]: I1203 20:24:19.490876 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/nova-operator-controller-manager-865fc86d5b-twshg"] Dec 03 20:24:19.492331 master-0 kubenswrapper[29252]: I1203 20:24:19.490919 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-845b79dc4f-kp2nj"] Dec 03 20:24:19.492331 master-0 kubenswrapper[29252]: I1203 20:24:19.490934 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-647f96877-h76kk"] Dec 03 20:24:19.492766 master-0 kubenswrapper[29252]: I1203 20:24:19.492737 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp"] Dec 03 20:24:19.493739 master-0 kubenswrapper[29252]: I1203 20:24:19.493709 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-6b64f6f645-vvm54"] Dec 03 20:24:19.497088 master-0 kubenswrapper[29252]: I1203 20:24:19.494901 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-vvm54" Dec 03 20:24:19.497088 master-0 kubenswrapper[29252]: I1203 20:24:19.495398 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-647f96877-h76kk" Dec 03 20:24:19.497088 master-0 kubenswrapper[29252]: I1203 20:24:19.495485 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" Dec 03 20:24:19.497088 master-0 kubenswrapper[29252]: I1203 20:24:19.496849 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqm5n\" (UniqueName: \"kubernetes.io/projected/12773d43-060c-4f0c-8a6c-615a6f577894-kube-api-access-bqm5n\") pod \"mariadb-operator-controller-manager-647d75769b-qw258\" (UID: \"12773d43-060c-4f0c-8a6c-615a6f577894\") " pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-qw258" Dec 03 20:24:19.497088 master-0 kubenswrapper[29252]: I1203 20:24:19.496906 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv9rc\" (UniqueName: \"kubernetes.io/projected/d3595782-c6a7-4f72-99fb-44f3a68f1f6d-kube-api-access-qv9rc\") pod \"neutron-operator-controller-manager-7cdd6b54fb-5fcb6\" (UID: \"d3595782-c6a7-4f72-99fb-44f3a68f1f6d\") " pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-5fcb6" Dec 03 20:24:19.497088 master-0 kubenswrapper[29252]: I1203 20:24:19.496998 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s65p\" (UniqueName: \"kubernetes.io/projected/d27a7a24-0257-494c-9cc6-889c8a971e81-kube-api-access-2s65p\") pod \"nova-operator-controller-manager-865fc86d5b-twshg\" (UID: \"d27a7a24-0257-494c-9cc6-889c8a971e81\") " pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-twshg" Dec 03 20:24:19.497088 master-0 kubenswrapper[29252]: I1203 20:24:19.497062 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnhhg\" (UniqueName: \"kubernetes.io/projected/12a31e7e-9fbb-49e1-b779-6050f8898ce3-kube-api-access-lnhhg\") pod \"octavia-operator-controller-manager-845b79dc4f-kp2nj\" (UID: \"12a31e7e-9fbb-49e1-b779-6050f8898ce3\") " 
pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-kp2nj" Dec 03 20:24:19.498560 master-0 kubenswrapper[29252]: I1203 20:24:19.498211 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 03 20:24:19.533797 master-0 kubenswrapper[29252]: I1203 20:24:19.533742 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-647f96877-h76kk"] Dec 03 20:24:19.540528 master-0 kubenswrapper[29252]: I1203 20:24:19.540461 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-6b64f6f645-vvm54"] Dec 03 20:24:19.548977 master-0 kubenswrapper[29252]: I1203 20:24:19.548313 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp"] Dec 03 20:24:19.556555 master-0 kubenswrapper[29252]: I1203 20:24:19.556171 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqm5n\" (UniqueName: \"kubernetes.io/projected/12773d43-060c-4f0c-8a6c-615a6f577894-kube-api-access-bqm5n\") pod \"mariadb-operator-controller-manager-647d75769b-qw258\" (UID: \"12773d43-060c-4f0c-8a6c-615a6f577894\") " pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-qw258" Dec 03 20:24:19.565276 master-0 kubenswrapper[29252]: I1203 20:24:19.565224 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-696b999796-z7xsv"] Dec 03 20:24:19.567289 master-0 kubenswrapper[29252]: I1203 20:24:19.567070 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-696b999796-z7xsv" Dec 03 20:24:19.571017 master-0 kubenswrapper[29252]: I1203 20:24:19.570976 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-lf5ts"] Dec 03 20:24:19.576280 master-0 kubenswrapper[29252]: I1203 20:24:19.575803 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-lf5ts" Dec 03 20:24:19.582846 master-0 kubenswrapper[29252]: I1203 20:24:19.579966 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-57dfcdd5b8-tgvnm"] Dec 03 20:24:19.582846 master-0 kubenswrapper[29252]: I1203 20:24:19.581900 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-tgvnm" Dec 03 20:24:19.585085 master-0 kubenswrapper[29252]: I1203 20:24:19.585043 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-696b999796-z7xsv"] Dec 03 20:24:19.602811 master-0 kubenswrapper[29252]: I1203 20:24:19.602726 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s65p\" (UniqueName: \"kubernetes.io/projected/d27a7a24-0257-494c-9cc6-889c8a971e81-kube-api-access-2s65p\") pod \"nova-operator-controller-manager-865fc86d5b-twshg\" (UID: \"d27a7a24-0257-494c-9cc6-889c8a971e81\") " pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-twshg" Dec 03 20:24:19.603062 master-0 kubenswrapper[29252]: I1203 20:24:19.602837 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zgvl\" (UniqueName: \"kubernetes.io/projected/eb78027d-6293-4a9a-961a-d4b57eb0e5f5-kube-api-access-5zgvl\") pod 
\"placement-operator-controller-manager-6b64f6f645-vvm54\" (UID: \"eb78027d-6293-4a9a-961a-d4b57eb0e5f5\") " pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-vvm54" Dec 03 20:24:19.603062 master-0 kubenswrapper[29252]: I1203 20:24:19.602881 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnhhg\" (UniqueName: \"kubernetes.io/projected/12a31e7e-9fbb-49e1-b779-6050f8898ce3-kube-api-access-lnhhg\") pod \"octavia-operator-controller-manager-845b79dc4f-kp2nj\" (UID: \"12a31e7e-9fbb-49e1-b779-6050f8898ce3\") " pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-kp2nj" Dec 03 20:24:19.603062 master-0 kubenswrapper[29252]: I1203 20:24:19.602911 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85n5p\" (UniqueName: \"kubernetes.io/projected/b08561ad-441a-4ed6-b8d2-4af65531b047-kube-api-access-85n5p\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp\" (UID: \"b08561ad-441a-4ed6-b8d2-4af65531b047\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" Dec 03 20:24:19.603062 master-0 kubenswrapper[29252]: I1203 20:24:19.602965 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp\" (UID: \"b08561ad-441a-4ed6-b8d2-4af65531b047\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" Dec 03 20:24:19.603062 master-0 kubenswrapper[29252]: I1203 20:24:19.603008 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv9rc\" (UniqueName: \"kubernetes.io/projected/d3595782-c6a7-4f72-99fb-44f3a68f1f6d-kube-api-access-qv9rc\") pod 
\"neutron-operator-controller-manager-7cdd6b54fb-5fcb6\" (UID: \"d3595782-c6a7-4f72-99fb-44f3a68f1f6d\") " pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-5fcb6" Dec 03 20:24:19.603062 master-0 kubenswrapper[29252]: I1203 20:24:19.603061 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqkfj\" (UniqueName: \"kubernetes.io/projected/2fe48ec1-85b8-48dc-b0b5-8f3a5dc91dd0-kube-api-access-tqkfj\") pod \"ovn-operator-controller-manager-647f96877-h76kk\" (UID: \"2fe48ec1-85b8-48dc-b0b5-8f3a5dc91dd0\") " pod="openstack-operators/ovn-operator-controller-manager-647f96877-h76kk" Dec 03 20:24:19.621038 master-0 kubenswrapper[29252]: I1203 20:24:19.620631 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-lf5ts"] Dec 03 20:24:19.629795 master-0 kubenswrapper[29252]: I1203 20:24:19.629468 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnhhg\" (UniqueName: \"kubernetes.io/projected/12a31e7e-9fbb-49e1-b779-6050f8898ce3-kube-api-access-lnhhg\") pod \"octavia-operator-controller-manager-845b79dc4f-kp2nj\" (UID: \"12a31e7e-9fbb-49e1-b779-6050f8898ce3\") " pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-kp2nj" Dec 03 20:24:19.630644 master-0 kubenswrapper[29252]: I1203 20:24:19.630615 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s65p\" (UniqueName: \"kubernetes.io/projected/d27a7a24-0257-494c-9cc6-889c8a971e81-kube-api-access-2s65p\") pod \"nova-operator-controller-manager-865fc86d5b-twshg\" (UID: \"d27a7a24-0257-494c-9cc6-889c8a971e81\") " pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-twshg" Dec 03 20:24:19.637369 master-0 kubenswrapper[29252]: I1203 20:24:19.632699 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv9rc\" 
(UniqueName: \"kubernetes.io/projected/d3595782-c6a7-4f72-99fb-44f3a68f1f6d-kube-api-access-qv9rc\") pod \"neutron-operator-controller-manager-7cdd6b54fb-5fcb6\" (UID: \"d3595782-c6a7-4f72-99fb-44f3a68f1f6d\") " pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-5fcb6" Dec 03 20:24:19.649431 master-0 kubenswrapper[29252]: I1203 20:24:19.649224 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-t5dvt" Dec 03 20:24:19.654798 master-0 kubenswrapper[29252]: I1203 20:24:19.649612 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9b669fdb-nbjnc"] Dec 03 20:24:19.654798 master-0 kubenswrapper[29252]: I1203 20:24:19.652817 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-nbjnc" Dec 03 20:24:19.661846 master-0 kubenswrapper[29252]: I1203 20:24:19.661754 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-57dfcdd5b8-tgvnm"] Dec 03 20:24:19.663243 master-0 kubenswrapper[29252]: I1203 20:24:19.663201 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-q66vk" Dec 03 20:24:19.679842 master-0 kubenswrapper[29252]: I1203 20:24:19.670080 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9b669fdb-nbjnc"] Dec 03 20:24:19.698903 master-0 kubenswrapper[29252]: I1203 20:24:19.693757 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-qw258" Dec 03 20:24:19.701746 master-0 kubenswrapper[29252]: I1203 20:24:19.701709 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-5fcb6" Dec 03 20:24:19.706322 master-0 kubenswrapper[29252]: I1203 20:24:19.704391 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz2z9\" (UniqueName: \"kubernetes.io/projected/7363059f-f7ee-4bb1-a028-f021e2e51f8e-kube-api-access-kz2z9\") pod \"telemetry-operator-controller-manager-7b5867bfc7-lf5ts\" (UID: \"7363059f-f7ee-4bb1-a028-f021e2e51f8e\") " pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-lf5ts" Dec 03 20:24:19.706322 master-0 kubenswrapper[29252]: I1203 20:24:19.704436 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85n5p\" (UniqueName: \"kubernetes.io/projected/b08561ad-441a-4ed6-b8d2-4af65531b047-kube-api-access-85n5p\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp\" (UID: \"b08561ad-441a-4ed6-b8d2-4af65531b047\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" Dec 03 20:24:19.706322 master-0 kubenswrapper[29252]: I1203 20:24:19.704517 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp\" (UID: \"b08561ad-441a-4ed6-b8d2-4af65531b047\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" Dec 03 20:24:19.706322 master-0 kubenswrapper[29252]: I1203 20:24:19.704820 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-jcxjt\" (UID: \"e0c29a23-11dd-445c-8ebf-cef7994d7bc3\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt" Dec 03 
20:24:19.706322 master-0 kubenswrapper[29252]: I1203 20:24:19.704892 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxvvr\" (UniqueName: \"kubernetes.io/projected/90e0e11c-59df-46fd-9d7e-4c77a66cab18-kube-api-access-hxvvr\") pod \"swift-operator-controller-manager-696b999796-z7xsv\" (UID: \"90e0e11c-59df-46fd-9d7e-4c77a66cab18\") " pod="openstack-operators/swift-operator-controller-manager-696b999796-z7xsv" Dec 03 20:24:19.706322 master-0 kubenswrapper[29252]: E1203 20:24:19.704822 29252 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 20:24:19.706322 master-0 kubenswrapper[29252]: I1203 20:24:19.704951 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqkfj\" (UniqueName: \"kubernetes.io/projected/2fe48ec1-85b8-48dc-b0b5-8f3a5dc91dd0-kube-api-access-tqkfj\") pod \"ovn-operator-controller-manager-647f96877-h76kk\" (UID: \"2fe48ec1-85b8-48dc-b0b5-8f3a5dc91dd0\") " pod="openstack-operators/ovn-operator-controller-manager-647f96877-h76kk" Dec 03 20:24:19.706322 master-0 kubenswrapper[29252]: E1203 20:24:19.704986 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert podName:b08561ad-441a-4ed6-b8d2-4af65531b047 nodeName:}" failed. No retries permitted until 2025-12-03 20:24:20.204971 +0000 UTC m=+895.018515953 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert") pod "openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" (UID: "b08561ad-441a-4ed6-b8d2-4af65531b047") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 20:24:19.706322 master-0 kubenswrapper[29252]: I1203 20:24:19.705000 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c552p\" (UniqueName: \"kubernetes.io/projected/7b7362c6-9cc4-45dd-8a04-614481022860-kube-api-access-c552p\") pod \"test-operator-controller-manager-57dfcdd5b8-tgvnm\" (UID: \"7b7362c6-9cc4-45dd-8a04-614481022860\") " pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-tgvnm" Dec 03 20:24:19.706322 master-0 kubenswrapper[29252]: I1203 20:24:19.705024 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zgvl\" (UniqueName: \"kubernetes.io/projected/eb78027d-6293-4a9a-961a-d4b57eb0e5f5-kube-api-access-5zgvl\") pod \"placement-operator-controller-manager-6b64f6f645-vvm54\" (UID: \"eb78027d-6293-4a9a-961a-d4b57eb0e5f5\") " pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-vvm54" Dec 03 20:24:19.706322 master-0 kubenswrapper[29252]: E1203 20:24:19.704858 29252 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 20:24:19.706322 master-0 kubenswrapper[29252]: E1203 20:24:19.705152 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert podName:e0c29a23-11dd-445c-8ebf-cef7994d7bc3 nodeName:}" failed. No retries permitted until 2025-12-03 20:24:20.705143884 +0000 UTC m=+895.518688837 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-jcxjt" (UID: "e0c29a23-11dd-445c-8ebf-cef7994d7bc3") : secret "infra-operator-webhook-server-cert" not found Dec 03 20:24:19.722622 master-0 kubenswrapper[29252]: I1203 20:24:19.714712 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r"] Dec 03 20:24:19.722622 master-0 kubenswrapper[29252]: I1203 20:24:19.716587 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:19.722622 master-0 kubenswrapper[29252]: I1203 20:24:19.718210 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 03 20:24:19.722622 master-0 kubenswrapper[29252]: I1203 20:24:19.718480 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 03 20:24:19.722622 master-0 kubenswrapper[29252]: I1203 20:24:19.721905 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85n5p\" (UniqueName: \"kubernetes.io/projected/b08561ad-441a-4ed6-b8d2-4af65531b047-kube-api-access-85n5p\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp\" (UID: \"b08561ad-441a-4ed6-b8d2-4af65531b047\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" Dec 03 20:24:19.729914 master-0 kubenswrapper[29252]: I1203 20:24:19.729878 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zgvl\" (UniqueName: \"kubernetes.io/projected/eb78027d-6293-4a9a-961a-d4b57eb0e5f5-kube-api-access-5zgvl\") pod \"placement-operator-controller-manager-6b64f6f645-vvm54\" (UID: \"eb78027d-6293-4a9a-961a-d4b57eb0e5f5\") " 
pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-vvm54" Dec 03 20:24:19.731421 master-0 kubenswrapper[29252]: I1203 20:24:19.731393 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqkfj\" (UniqueName: \"kubernetes.io/projected/2fe48ec1-85b8-48dc-b0b5-8f3a5dc91dd0-kube-api-access-tqkfj\") pod \"ovn-operator-controller-manager-647f96877-h76kk\" (UID: \"2fe48ec1-85b8-48dc-b0b5-8f3a5dc91dd0\") " pod="openstack-operators/ovn-operator-controller-manager-647f96877-h76kk" Dec 03 20:24:19.736912 master-0 kubenswrapper[29252]: I1203 20:24:19.736881 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-twshg" Dec 03 20:24:19.786205 master-0 kubenswrapper[29252]: I1203 20:24:19.786028 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r"] Dec 03 20:24:19.811819 master-0 kubenswrapper[29252]: I1203 20:24:19.811763 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:19.811905 master-0 kubenswrapper[29252]: I1203 20:24:19.811845 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxvvr\" (UniqueName: \"kubernetes.io/projected/90e0e11c-59df-46fd-9d7e-4c77a66cab18-kube-api-access-hxvvr\") pod \"swift-operator-controller-manager-696b999796-z7xsv\" (UID: \"90e0e11c-59df-46fd-9d7e-4c77a66cab18\") " pod="openstack-operators/swift-operator-controller-manager-696b999796-z7xsv" Dec 03 20:24:19.811905 master-0 kubenswrapper[29252]: I1203 
20:24:19.811871 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvnfz\" (UniqueName: \"kubernetes.io/projected/0c987116-b442-4fd5-b528-bb2540c8c37c-kube-api-access-wvnfz\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:19.811983 master-0 kubenswrapper[29252]: I1203 20:24:19.811932 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c552p\" (UniqueName: \"kubernetes.io/projected/7b7362c6-9cc4-45dd-8a04-614481022860-kube-api-access-c552p\") pod \"test-operator-controller-manager-57dfcdd5b8-tgvnm\" (UID: \"7b7362c6-9cc4-45dd-8a04-614481022860\") " pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-tgvnm" Dec 03 20:24:19.811983 master-0 kubenswrapper[29252]: I1203 20:24:19.811974 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz2z9\" (UniqueName: \"kubernetes.io/projected/7363059f-f7ee-4bb1-a028-f021e2e51f8e-kube-api-access-kz2z9\") pod \"telemetry-operator-controller-manager-7b5867bfc7-lf5ts\" (UID: \"7363059f-f7ee-4bb1-a028-f021e2e51f8e\") " pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-lf5ts" Dec 03 20:24:19.812043 master-0 kubenswrapper[29252]: I1203 20:24:19.811997 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29gdg\" (UniqueName: \"kubernetes.io/projected/f3ed1633-722c-4440-95db-2b644be51ba9-kube-api-access-29gdg\") pod \"watcher-operator-controller-manager-6b9b669fdb-nbjnc\" (UID: \"f3ed1633-722c-4440-95db-2b644be51ba9\") " pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-nbjnc" Dec 03 20:24:19.812074 master-0 kubenswrapper[29252]: I1203 20:24:19.812053 29252 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:19.835025 master-0 kubenswrapper[29252]: I1203 20:24:19.828634 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-nlszs"] Dec 03 20:24:19.835025 master-0 kubenswrapper[29252]: I1203 20:24:19.829767 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-nlszs" Dec 03 20:24:19.835025 master-0 kubenswrapper[29252]: I1203 20:24:19.832204 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-kp2nj" Dec 03 20:24:19.835025 master-0 kubenswrapper[29252]: I1203 20:24:19.833493 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c552p\" (UniqueName: \"kubernetes.io/projected/7b7362c6-9cc4-45dd-8a04-614481022860-kube-api-access-c552p\") pod \"test-operator-controller-manager-57dfcdd5b8-tgvnm\" (UID: \"7b7362c6-9cc4-45dd-8a04-614481022860\") " pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-tgvnm" Dec 03 20:24:19.836113 master-0 kubenswrapper[29252]: I1203 20:24:19.835263 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz2z9\" (UniqueName: \"kubernetes.io/projected/7363059f-f7ee-4bb1-a028-f021e2e51f8e-kube-api-access-kz2z9\") pod \"telemetry-operator-controller-manager-7b5867bfc7-lf5ts\" (UID: \"7363059f-f7ee-4bb1-a028-f021e2e51f8e\") " pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-lf5ts" Dec 03 20:24:19.842588 
master-0 kubenswrapper[29252]: I1203 20:24:19.842242 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxvvr\" (UniqueName: \"kubernetes.io/projected/90e0e11c-59df-46fd-9d7e-4c77a66cab18-kube-api-access-hxvvr\") pod \"swift-operator-controller-manager-696b999796-z7xsv\" (UID: \"90e0e11c-59df-46fd-9d7e-4c77a66cab18\") " pod="openstack-operators/swift-operator-controller-manager-696b999796-z7xsv" Dec 03 20:24:19.868523 master-0 kubenswrapper[29252]: I1203 20:24:19.866957 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-nlszs"] Dec 03 20:24:19.906392 master-0 kubenswrapper[29252]: I1203 20:24:19.905962 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-f8856dd79-fsg7r"] Dec 03 20:24:19.913456 master-0 kubenswrapper[29252]: I1203 20:24:19.913007 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smsbv\" (UniqueName: \"kubernetes.io/projected/88b03a41-99d3-4be2-913d-6a6ce4ad4b78-kube-api-access-smsbv\") pod \"rabbitmq-cluster-operator-manager-78955d896f-nlszs\" (UID: \"88b03a41-99d3-4be2-913d-6a6ce4ad4b78\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-nlszs" Dec 03 20:24:19.913456 master-0 kubenswrapper[29252]: I1203 20:24:19.913074 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29gdg\" (UniqueName: \"kubernetes.io/projected/f3ed1633-722c-4440-95db-2b644be51ba9-kube-api-access-29gdg\") pod \"watcher-operator-controller-manager-6b9b669fdb-nbjnc\" (UID: \"f3ed1633-722c-4440-95db-2b644be51ba9\") " pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-nbjnc" Dec 03 20:24:19.913456 master-0 kubenswrapper[29252]: I1203 20:24:19.913129 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:19.913456 master-0 kubenswrapper[29252]: I1203 20:24:19.913188 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:19.913456 master-0 kubenswrapper[29252]: I1203 20:24:19.913229 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvnfz\" (UniqueName: \"kubernetes.io/projected/0c987116-b442-4fd5-b528-bb2540c8c37c-kube-api-access-wvnfz\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:19.916888 master-0 kubenswrapper[29252]: E1203 20:24:19.913805 29252 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 20:24:19.916888 master-0 kubenswrapper[29252]: E1203 20:24:19.913847 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs podName:0c987116-b442-4fd5-b528-bb2540c8c37c nodeName:}" failed. No retries permitted until 2025-12-03 20:24:20.41383209 +0000 UTC m=+895.227377043 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs") pod "openstack-operator-controller-manager-57d98476c4-g442r" (UID: "0c987116-b442-4fd5-b528-bb2540c8c37c") : secret "webhook-server-cert" not found Dec 03 20:24:19.916888 master-0 kubenswrapper[29252]: E1203 20:24:19.913999 29252 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 20:24:19.916888 master-0 kubenswrapper[29252]: E1203 20:24:19.914023 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs podName:0c987116-b442-4fd5-b528-bb2540c8c37c nodeName:}" failed. No retries permitted until 2025-12-03 20:24:20.414015135 +0000 UTC m=+895.227560088 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs") pod "openstack-operator-controller-manager-57d98476c4-g442r" (UID: "0c987116-b442-4fd5-b528-bb2540c8c37c") : secret "metrics-server-cert" not found Dec 03 20:24:19.936706 master-0 kubenswrapper[29252]: I1203 20:24:19.936640 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvnfz\" (UniqueName: \"kubernetes.io/projected/0c987116-b442-4fd5-b528-bb2540c8c37c-kube-api-access-wvnfz\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:19.937100 master-0 kubenswrapper[29252]: I1203 20:24:19.937079 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29gdg\" (UniqueName: \"kubernetes.io/projected/f3ed1633-722c-4440-95db-2b644be51ba9-kube-api-access-29gdg\") pod \"watcher-operator-controller-manager-6b9b669fdb-nbjnc\" (UID: 
\"f3ed1633-722c-4440-95db-2b644be51ba9\") " pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-nbjnc" Dec 03 20:24:19.946825 master-0 kubenswrapper[29252]: I1203 20:24:19.946553 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-vvm54" Dec 03 20:24:19.966510 master-0 kubenswrapper[29252]: I1203 20:24:19.966278 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-647f96877-h76kk" Dec 03 20:24:20.013386 master-0 kubenswrapper[29252]: I1203 20:24:20.013323 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-696b999796-z7xsv" Dec 03 20:24:20.019477 master-0 kubenswrapper[29252]: I1203 20:24:20.017720 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smsbv\" (UniqueName: \"kubernetes.io/projected/88b03a41-99d3-4be2-913d-6a6ce4ad4b78-kube-api-access-smsbv\") pod \"rabbitmq-cluster-operator-manager-78955d896f-nlszs\" (UID: \"88b03a41-99d3-4be2-913d-6a6ce4ad4b78\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-nlszs" Dec 03 20:24:20.040355 master-0 kubenswrapper[29252]: I1203 20:24:20.039687 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-lf5ts" Dec 03 20:24:20.044822 master-0 kubenswrapper[29252]: I1203 20:24:20.044417 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smsbv\" (UniqueName: \"kubernetes.io/projected/88b03a41-99d3-4be2-913d-6a6ce4ad4b78-kube-api-access-smsbv\") pod \"rabbitmq-cluster-operator-manager-78955d896f-nlszs\" (UID: \"88b03a41-99d3-4be2-913d-6a6ce4ad4b78\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-nlszs" Dec 03 20:24:20.049975 master-0 kubenswrapper[29252]: I1203 20:24:20.049827 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-tgvnm" Dec 03 20:24:20.065672 master-0 kubenswrapper[29252]: I1203 20:24:20.065606 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-nbjnc" Dec 03 20:24:20.071000 master-0 kubenswrapper[29252]: I1203 20:24:20.070792 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-nlszs" Dec 03 20:24:20.144760 master-0 kubenswrapper[29252]: I1203 20:24:20.144696 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cd89994b5-78ft8"] Dec 03 20:24:20.157034 master-0 kubenswrapper[29252]: I1203 20:24:20.156974 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-78cd4f7769-7lgz9"] Dec 03 20:24:20.225541 master-0 kubenswrapper[29252]: I1203 20:24:20.222179 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-fsg7r" event={"ID":"4bfaaa2e-15b3-40fb-93c2-994c4a38559d","Type":"ContainerStarted","Data":"3fbc825888c8ffeeda8fae7c5402d13b2e1ad9308f2764b2011b2c2bceb09d07"} Dec 03 20:24:20.230029 master-0 kubenswrapper[29252]: I1203 20:24:20.228276 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp\" (UID: \"b08561ad-441a-4ed6-b8d2-4af65531b047\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" Dec 03 20:24:20.230029 master-0 kubenswrapper[29252]: E1203 20:24:20.228754 29252 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 20:24:20.230029 master-0 kubenswrapper[29252]: E1203 20:24:20.228930 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert podName:b08561ad-441a-4ed6-b8d2-4af65531b047 nodeName:}" failed. No retries permitted until 2025-12-03 20:24:21.228913181 +0000 UTC m=+896.042458134 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert") pod "openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" (UID: "b08561ad-441a-4ed6-b8d2-4af65531b047") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 20:24:20.233029 master-0 kubenswrapper[29252]: I1203 20:24:20.232713 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-78ft8" event={"ID":"b6ca362c-809a-47d1-8b68-9848967d382a","Type":"ContainerStarted","Data":"a2e300034489b28f47d7dc45371eaf4b63c6e8113bad641ef7270d20eb588598"} Dec 03 20:24:20.234157 master-0 kubenswrapper[29252]: I1203 20:24:20.234130 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-7lgz9" event={"ID":"c2fed802-28a0-40d3-b422-581c334d8bc5","Type":"ContainerStarted","Data":"68a9be59d10ad644eecf34977bd0e1f9ab7e8096e9d1ff5c6fb012f3d9eaa2cc"} Dec 03 20:24:20.387450 master-0 kubenswrapper[29252]: I1203 20:24:20.387386 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-7fd96594c7-bhdr8"] Dec 03 20:24:20.446308 master-0 kubenswrapper[29252]: W1203 20:24:20.428930 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f965ddc_ff13_4f8e_b20c_aad918a7be33.slice/crio-6d18faeb19b8dec5d46e6f561ad2d8c145823b88ebbf30478ab312e8713fb5e0 WatchSource:0}: Error finding container 6d18faeb19b8dec5d46e6f561ad2d8c145823b88ebbf30478ab312e8713fb5e0: Status 404 returned error can't find the container with id 6d18faeb19b8dec5d46e6f561ad2d8c145823b88ebbf30478ab312e8713fb5e0 Dec 03 20:24:20.446308 master-0 kubenswrapper[29252]: I1203 20:24:20.434161 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:20.446308 master-0 kubenswrapper[29252]: I1203 20:24:20.434252 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:20.446308 master-0 kubenswrapper[29252]: E1203 20:24:20.434514 29252 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 20:24:20.446308 master-0 kubenswrapper[29252]: E1203 20:24:20.434572 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs podName:0c987116-b442-4fd5-b528-bb2540c8c37c nodeName:}" failed. No retries permitted until 2025-12-03 20:24:21.434554032 +0000 UTC m=+896.248099005 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs") pod "openstack-operator-controller-manager-57d98476c4-g442r" (UID: "0c987116-b442-4fd5-b528-bb2540c8c37c") : secret "metrics-server-cert" not found Dec 03 20:24:20.446308 master-0 kubenswrapper[29252]: E1203 20:24:20.435002 29252 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 20:24:20.446308 master-0 kubenswrapper[29252]: E1203 20:24:20.435039 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs podName:0c987116-b442-4fd5-b528-bb2540c8c37c nodeName:}" failed. No retries permitted until 2025-12-03 20:24:21.435028494 +0000 UTC m=+896.248573457 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs") pod "openstack-operator-controller-manager-57d98476c4-g442r" (UID: "0c987116-b442-4fd5-b528-bb2540c8c37c") : secret "webhook-server-cert" not found Dec 03 20:24:20.708838 master-0 kubenswrapper[29252]: I1203 20:24:20.706592 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-t5dvt"] Dec 03 20:24:20.716172 master-0 kubenswrapper[29252]: I1203 20:24:20.716095 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84bc9f68f5-pnq2w"] Dec 03 20:24:20.741089 master-0 kubenswrapper[29252]: I1203 20:24:20.740329 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-jcxjt\" (UID: \"e0c29a23-11dd-445c-8ebf-cef7994d7bc3\") " 
pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt" Dec 03 20:24:20.741089 master-0 kubenswrapper[29252]: E1203 20:24:20.740544 29252 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 20:24:20.741089 master-0 kubenswrapper[29252]: E1203 20:24:20.740598 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert podName:e0c29a23-11dd-445c-8ebf-cef7994d7bc3 nodeName:}" failed. No retries permitted until 2025-12-03 20:24:22.740581011 +0000 UTC m=+897.554125974 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-jcxjt" (UID: "e0c29a23-11dd-445c-8ebf-cef7994d7bc3") : secret "infra-operator-webhook-server-cert" not found Dec 03 20:24:20.751163 master-0 kubenswrapper[29252]: I1203 20:24:20.751115 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-f6cc97788-v6spm"] Dec 03 20:24:20.763744 master-0 kubenswrapper[29252]: W1203 20:24:20.763677 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d7bb0ae_4a5d_4196_a340_51fca6907f3a.slice/crio-47a690646f4cc839b4522240e5e6e529087a7f863469a8c0c7e3d1e615c45ca8 WatchSource:0}: Error finding container 47a690646f4cc839b4522240e5e6e529087a7f863469a8c0c7e3d1e615c45ca8: Status 404 returned error can't find the container with id 47a690646f4cc839b4522240e5e6e529087a7f863469a8c0c7e3d1e615c45ca8 Dec 03 20:24:20.764605 master-0 kubenswrapper[29252]: I1203 20:24:20.764424 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7c9bfd6967-kgmrh"] Dec 03 20:24:21.154888 master-0 kubenswrapper[29252]: 
I1203 20:24:21.135658 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-647d75769b-qw258"] Dec 03 20:24:21.154888 master-0 kubenswrapper[29252]: W1203 20:24:21.142067 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd27a7a24_0257_494c_9cc6_889c8a971e81.slice/crio-444c7c3b73cb0233989c5ece49358677ab9faa5ca28b732557c83511fc78ac2e WatchSource:0}: Error finding container 444c7c3b73cb0233989c5ece49358677ab9faa5ca28b732557c83511fc78ac2e: Status 404 returned error can't find the container with id 444c7c3b73cb0233989c5ece49358677ab9faa5ca28b732557c83511fc78ac2e Dec 03 20:24:21.154888 master-0 kubenswrapper[29252]: I1203 20:24:21.153827 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-845b79dc4f-kp2nj"] Dec 03 20:24:21.164069 master-0 kubenswrapper[29252]: I1203 20:24:21.164025 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-865fc86d5b-twshg"] Dec 03 20:24:21.177873 master-0 kubenswrapper[29252]: I1203 20:24:21.177829 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-5fcb6"] Dec 03 20:24:21.194384 master-0 kubenswrapper[29252]: I1203 20:24:21.194332 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-56f9fbf74b-q66vk"] Dec 03 20:24:21.243249 master-0 kubenswrapper[29252]: I1203 20:24:21.243204 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-kp2nj" event={"ID":"12a31e7e-9fbb-49e1-b779-6050f8898ce3","Type":"ContainerStarted","Data":"28c7a116200587e1b610020f936a90a0dbd33fed23e6019e5bc362cf03b6e479"} Dec 03 20:24:21.245154 master-0 kubenswrapper[29252]: I1203 20:24:21.245129 29252 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-5fcb6" event={"ID":"d3595782-c6a7-4f72-99fb-44f3a68f1f6d","Type":"ContainerStarted","Data":"ed2163f36f0174fb5a3720e6f4a39859833208a4d0ea696ae0c3fb00ff93b09d"} Dec 03 20:24:21.246236 master-0 kubenswrapper[29252]: I1203 20:24:21.246186 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bhdr8" event={"ID":"7f965ddc-ff13-4f8e-b20c-aad918a7be33","Type":"ContainerStarted","Data":"6d18faeb19b8dec5d46e6f561ad2d8c145823b88ebbf30478ab312e8713fb5e0"} Dec 03 20:24:21.247319 master-0 kubenswrapper[29252]: I1203 20:24:21.247297 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-q66vk" event={"ID":"64a91869-6d5d-4f4a-8f51-9eab613c4b13","Type":"ContainerStarted","Data":"fb55204aad6f60c2735feaf8ff3f7c5716398a1ad5efb703c289507eb7c88029"} Dec 03 20:24:21.250231 master-0 kubenswrapper[29252]: I1203 20:24:21.250171 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-twshg" event={"ID":"d27a7a24-0257-494c-9cc6-889c8a971e81","Type":"ContainerStarted","Data":"444c7c3b73cb0233989c5ece49358677ab9faa5ca28b732557c83511fc78ac2e"} Dec 03 20:24:21.252886 master-0 kubenswrapper[29252]: I1203 20:24:21.252832 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-kgmrh" event={"ID":"7d7bb0ae-4a5d-4196-a340-51fca6907f3a","Type":"ContainerStarted","Data":"47a690646f4cc839b4522240e5e6e529087a7f863469a8c0c7e3d1e615c45ca8"} Dec 03 20:24:21.254226 master-0 kubenswrapper[29252]: I1203 20:24:21.254178 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-v6spm" 
event={"ID":"4cf77700-7d9a-4d7e-bf0a-71777fa32e55","Type":"ContainerStarted","Data":"e1484013fd0a0e5015428c9fe0c4d3f9e5592b0c469681bd845a36944a7964d8"} Dec 03 20:24:21.255329 master-0 kubenswrapper[29252]: I1203 20:24:21.255293 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-qw258" event={"ID":"12773d43-060c-4f0c-8a6c-615a6f577894","Type":"ContainerStarted","Data":"7c4a0cad0a4e0437f5f26df2c791824290f07c80f2598b191ad922a89ce574c1"} Dec 03 20:24:21.256102 master-0 kubenswrapper[29252]: I1203 20:24:21.256064 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp\" (UID: \"b08561ad-441a-4ed6-b8d2-4af65531b047\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" Dec 03 20:24:21.256374 master-0 kubenswrapper[29252]: E1203 20:24:21.256328 29252 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 20:24:21.256441 master-0 kubenswrapper[29252]: E1203 20:24:21.256394 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert podName:b08561ad-441a-4ed6-b8d2-4af65531b047 nodeName:}" failed. No retries permitted until 2025-12-03 20:24:23.256378982 +0000 UTC m=+898.069923935 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert") pod "openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" (UID: "b08561ad-441a-4ed6-b8d2-4af65531b047") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 20:24:21.257347 master-0 kubenswrapper[29252]: I1203 20:24:21.257306 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-pnq2w" event={"ID":"615341bc-bf59-4b24-9baa-3223edd30ad0","Type":"ContainerStarted","Data":"02e651b3a0b4a02f7cbfab9ae90fbd24cc16e317683786b342219e680d72eb36"} Dec 03 20:24:21.258733 master-0 kubenswrapper[29252]: I1203 20:24:21.258707 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-t5dvt" event={"ID":"28196001-edc5-4152-830f-7712255d742c","Type":"ContainerStarted","Data":"2691183452c594f3a9a66da9f5a66fc2951d9d81fce3d08575d1cebd8ec75b6e"} Dec 03 20:24:21.460275 master-0 kubenswrapper[29252]: I1203 20:24:21.460221 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:21.460508 master-0 kubenswrapper[29252]: I1203 20:24:21.460302 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:21.460508 master-0 
kubenswrapper[29252]: E1203 20:24:21.460425 29252 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 20:24:21.460508 master-0 kubenswrapper[29252]: E1203 20:24:21.460494 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs podName:0c987116-b442-4fd5-b528-bb2540c8c37c nodeName:}" failed. No retries permitted until 2025-12-03 20:24:23.460476107 +0000 UTC m=+898.274021060 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs") pod "openstack-operator-controller-manager-57d98476c4-g442r" (UID: "0c987116-b442-4fd5-b528-bb2540c8c37c") : secret "webhook-server-cert" not found Dec 03 20:24:21.460508 master-0 kubenswrapper[29252]: E1203 20:24:21.460500 29252 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 20:24:21.460691 master-0 kubenswrapper[29252]: E1203 20:24:21.460654 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs podName:0c987116-b442-4fd5-b528-bb2540c8c37c nodeName:}" failed. No retries permitted until 2025-12-03 20:24:23.460536969 +0000 UTC m=+898.274082032 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs") pod "openstack-operator-controller-manager-57d98476c4-g442r" (UID: "0c987116-b442-4fd5-b528-bb2540c8c37c") : secret "metrics-server-cert" not found Dec 03 20:24:21.826831 master-0 kubenswrapper[29252]: I1203 20:24:21.819482 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-lf5ts"] Dec 03 20:24:21.865651 master-0 kubenswrapper[29252]: I1203 20:24:21.865076 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-6b64f6f645-vvm54"] Dec 03 20:24:21.879323 master-0 kubenswrapper[29252]: I1203 20:24:21.879270 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-696b999796-z7xsv"] Dec 03 20:24:21.890845 master-0 kubenswrapper[29252]: I1203 20:24:21.890698 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-647f96877-h76kk"] Dec 03 20:24:21.898637 master-0 kubenswrapper[29252]: I1203 20:24:21.898528 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-nlszs"] Dec 03 20:24:21.908759 master-0 kubenswrapper[29252]: I1203 20:24:21.908701 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-57dfcdd5b8-tgvnm"] Dec 03 20:24:21.921711 master-0 kubenswrapper[29252]: I1203 20:24:21.921360 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9b669fdb-nbjnc"] Dec 03 20:24:22.831328 master-0 kubenswrapper[29252]: I1203 20:24:22.831267 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-jcxjt\" (UID: \"e0c29a23-11dd-445c-8ebf-cef7994d7bc3\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt" Dec 03 20:24:22.831875 master-0 kubenswrapper[29252]: E1203 20:24:22.831474 29252 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 20:24:22.831875 master-0 kubenswrapper[29252]: E1203 20:24:22.831531 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert podName:e0c29a23-11dd-445c-8ebf-cef7994d7bc3 nodeName:}" failed. No retries permitted until 2025-12-03 20:24:26.831515774 +0000 UTC m=+901.645060717 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-jcxjt" (UID: "e0c29a23-11dd-445c-8ebf-cef7994d7bc3") : secret "infra-operator-webhook-server-cert" not found Dec 03 20:24:22.991399 master-0 kubenswrapper[29252]: W1203 20:24:22.991348 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90e0e11c_59df_46fd_9d7e_4c77a66cab18.slice/crio-141a5797815ba65ba7d6904614e7e60eb1f61ee62a291df08eda4c59bb058fae WatchSource:0}: Error finding container 141a5797815ba65ba7d6904614e7e60eb1f61ee62a291df08eda4c59bb058fae: Status 404 returned error can't find the container with id 141a5797815ba65ba7d6904614e7e60eb1f61ee62a291df08eda4c59bb058fae Dec 03 20:24:23.288580 master-0 kubenswrapper[29252]: I1203 20:24:23.288444 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-696b999796-z7xsv" 
event={"ID":"90e0e11c-59df-46fd-9d7e-4c77a66cab18","Type":"ContainerStarted","Data":"141a5797815ba65ba7d6904614e7e60eb1f61ee62a291df08eda4c59bb058fae"} Dec 03 20:24:23.340668 master-0 kubenswrapper[29252]: I1203 20:24:23.340615 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp\" (UID: \"b08561ad-441a-4ed6-b8d2-4af65531b047\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" Dec 03 20:24:23.340926 master-0 kubenswrapper[29252]: E1203 20:24:23.340803 29252 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 20:24:23.340926 master-0 kubenswrapper[29252]: E1203 20:24:23.340865 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert podName:b08561ad-441a-4ed6-b8d2-4af65531b047 nodeName:}" failed. No retries permitted until 2025-12-03 20:24:27.340850328 +0000 UTC m=+902.154395281 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert") pod "openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" (UID: "b08561ad-441a-4ed6-b8d2-4af65531b047") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 20:24:23.543269 master-0 kubenswrapper[29252]: I1203 20:24:23.543126 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:23.543486 master-0 kubenswrapper[29252]: I1203 20:24:23.543292 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:23.543486 master-0 kubenswrapper[29252]: E1203 20:24:23.543308 29252 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 20:24:23.543486 master-0 kubenswrapper[29252]: E1203 20:24:23.543431 29252 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 20:24:23.543486 master-0 kubenswrapper[29252]: E1203 20:24:23.543435 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs podName:0c987116-b442-4fd5-b528-bb2540c8c37c nodeName:}" failed. 
No retries permitted until 2025-12-03 20:24:27.543416666 +0000 UTC m=+902.356961609 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs") pod "openstack-operator-controller-manager-57d98476c4-g442r" (UID: "0c987116-b442-4fd5-b528-bb2540c8c37c") : secret "metrics-server-cert" not found Dec 03 20:24:23.543486 master-0 kubenswrapper[29252]: E1203 20:24:23.543490 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs podName:0c987116-b442-4fd5-b528-bb2540c8c37c nodeName:}" failed. No retries permitted until 2025-12-03 20:24:27.543476827 +0000 UTC m=+902.357021770 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs") pod "openstack-operator-controller-manager-57d98476c4-g442r" (UID: "0c987116-b442-4fd5-b528-bb2540c8c37c") : secret "webhook-server-cert" not found Dec 03 20:24:26.839867 master-0 kubenswrapper[29252]: I1203 20:24:26.839752 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-jcxjt\" (UID: \"e0c29a23-11dd-445c-8ebf-cef7994d7bc3\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt" Dec 03 20:24:26.840547 master-0 kubenswrapper[29252]: E1203 20:24:26.839975 29252 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 20:24:26.840547 master-0 kubenswrapper[29252]: E1203 20:24:26.840070 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert podName:e0c29a23-11dd-445c-8ebf-cef7994d7bc3 
nodeName:}" failed. No retries permitted until 2025-12-03 20:24:34.840049425 +0000 UTC m=+909.653594378 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-jcxjt" (UID: "e0c29a23-11dd-445c-8ebf-cef7994d7bc3") : secret "infra-operator-webhook-server-cert" not found Dec 03 20:24:27.349819 master-0 kubenswrapper[29252]: I1203 20:24:27.349715 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp\" (UID: \"b08561ad-441a-4ed6-b8d2-4af65531b047\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" Dec 03 20:24:27.350153 master-0 kubenswrapper[29252]: E1203 20:24:27.350008 29252 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 20:24:27.350153 master-0 kubenswrapper[29252]: E1203 20:24:27.350059 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert podName:b08561ad-441a-4ed6-b8d2-4af65531b047 nodeName:}" failed. No retries permitted until 2025-12-03 20:24:35.350043076 +0000 UTC m=+910.163588029 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert") pod "openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" (UID: "b08561ad-441a-4ed6-b8d2-4af65531b047") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 20:24:27.555463 master-0 kubenswrapper[29252]: I1203 20:24:27.555402 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:27.555685 master-0 kubenswrapper[29252]: E1203 20:24:27.555571 29252 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 20:24:27.555685 master-0 kubenswrapper[29252]: I1203 20:24:27.555608 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:27.555685 master-0 kubenswrapper[29252]: E1203 20:24:27.555638 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs podName:0c987116-b442-4fd5-b528-bb2540c8c37c nodeName:}" failed. No retries permitted until 2025-12-03 20:24:35.555619305 +0000 UTC m=+910.369164268 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs") pod "openstack-operator-controller-manager-57d98476c4-g442r" (UID: "0c987116-b442-4fd5-b528-bb2540c8c37c") : secret "metrics-server-cert" not found Dec 03 20:24:27.555876 master-0 kubenswrapper[29252]: E1203 20:24:27.555747 29252 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 20:24:27.555876 master-0 kubenswrapper[29252]: E1203 20:24:27.555850 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs podName:0c987116-b442-4fd5-b528-bb2540c8c37c nodeName:}" failed. No retries permitted until 2025-12-03 20:24:35.55582902 +0000 UTC m=+910.369373973 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs") pod "openstack-operator-controller-manager-57d98476c4-g442r" (UID: "0c987116-b442-4fd5-b528-bb2540c8c37c") : secret "webhook-server-cert" not found Dec 03 20:24:32.925248 master-0 kubenswrapper[29252]: W1203 20:24:32.925175 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b7362c6_9cc4_45dd_8a04_614481022860.slice/crio-16094909c110c7df4078d655670ddafea95996d4ea2f7c4eb8292b913f2ef9ca WatchSource:0}: Error finding container 16094909c110c7df4078d655670ddafea95996d4ea2f7c4eb8292b913f2ef9ca: Status 404 returned error can't find the container with id 16094909c110c7df4078d655670ddafea95996d4ea2f7c4eb8292b913f2ef9ca Dec 03 20:24:32.930627 master-0 kubenswrapper[29252]: W1203 20:24:32.930579 29252 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb78027d_6293_4a9a_961a_d4b57eb0e5f5.slice/crio-dcf8fd1d0861b931e29255f11ca4d7ad86e61ee823cc99f784da5b18971780ac WatchSource:0}: Error finding container dcf8fd1d0861b931e29255f11ca4d7ad86e61ee823cc99f784da5b18971780ac: Status 404 returned error can't find the container with id dcf8fd1d0861b931e29255f11ca4d7ad86e61ee823cc99f784da5b18971780ac Dec 03 20:24:32.931104 master-0 kubenswrapper[29252]: I1203 20:24:32.931041 29252 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 20:24:33.457460 master-0 kubenswrapper[29252]: I1203 20:24:33.457413 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-tgvnm" event={"ID":"7b7362c6-9cc4-45dd-8a04-614481022860","Type":"ContainerStarted","Data":"16094909c110c7df4078d655670ddafea95996d4ea2f7c4eb8292b913f2ef9ca"} Dec 03 20:24:33.457939 master-0 kubenswrapper[29252]: I1203 20:24:33.457727 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-vvm54" event={"ID":"eb78027d-6293-4a9a-961a-d4b57eb0e5f5","Type":"ContainerStarted","Data":"dcf8fd1d0861b931e29255f11ca4d7ad86e61ee823cc99f784da5b18971780ac"} Dec 03 20:24:34.158547 master-0 kubenswrapper[29252]: W1203 20:24:34.158419 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3ed1633_722c_4440_95db_2b644be51ba9.slice/crio-c9be3a4c5398a5023de5f74e5411417658795a945f4cde30906cdc317d61ebf5 WatchSource:0}: Error finding container c9be3a4c5398a5023de5f74e5411417658795a945f4cde30906cdc317d61ebf5: Status 404 returned error can't find the container with id c9be3a4c5398a5023de5f74e5411417658795a945f4cde30906cdc317d61ebf5 Dec 03 20:24:34.161421 master-0 kubenswrapper[29252]: W1203 20:24:34.161298 29252 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7363059f_f7ee_4bb1_a028_f021e2e51f8e.slice/crio-ce146d46ce6feeb14b9c8c7822d6788e84d3585b0969cf95e31e9051a4246846 WatchSource:0}: Error finding container ce146d46ce6feeb14b9c8c7822d6788e84d3585b0969cf95e31e9051a4246846: Status 404 returned error can't find the container with id ce146d46ce6feeb14b9c8c7822d6788e84d3585b0969cf95e31e9051a4246846 Dec 03 20:24:34.166003 master-0 kubenswrapper[29252]: W1203 20:24:34.165845 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88b03a41_99d3_4be2_913d_6a6ce4ad4b78.slice/crio-92ac2e30c26102af58de3ea2ff3335d512d7b8391e1db3e142a81bed2280dc5a WatchSource:0}: Error finding container 92ac2e30c26102af58de3ea2ff3335d512d7b8391e1db3e142a81bed2280dc5a: Status 404 returned error can't find the container with id 92ac2e30c26102af58de3ea2ff3335d512d7b8391e1db3e142a81bed2280dc5a Dec 03 20:24:34.170959 master-0 kubenswrapper[29252]: W1203 20:24:34.169946 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fe48ec1_85b8_48dc_b0b5_8f3a5dc91dd0.slice/crio-bb3b904315bb9e3246a1364dab2a9baffc5482167ee2270447fefcd635443064 WatchSource:0}: Error finding container bb3b904315bb9e3246a1364dab2a9baffc5482167ee2270447fefcd635443064: Status 404 returned error can't find the container with id bb3b904315bb9e3246a1364dab2a9baffc5482167ee2270447fefcd635443064 Dec 03 20:24:34.474544 master-0 kubenswrapper[29252]: I1203 20:24:34.474414 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-nlszs" event={"ID":"88b03a41-99d3-4be2-913d-6a6ce4ad4b78","Type":"ContainerStarted","Data":"92ac2e30c26102af58de3ea2ff3335d512d7b8391e1db3e142a81bed2280dc5a"} Dec 03 20:24:34.475670 master-0 kubenswrapper[29252]: I1203 20:24:34.475629 29252 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-647f96877-h76kk" event={"ID":"2fe48ec1-85b8-48dc-b0b5-8f3a5dc91dd0","Type":"ContainerStarted","Data":"bb3b904315bb9e3246a1364dab2a9baffc5482167ee2270447fefcd635443064"} Dec 03 20:24:34.477026 master-0 kubenswrapper[29252]: I1203 20:24:34.476997 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-nbjnc" event={"ID":"f3ed1633-722c-4440-95db-2b644be51ba9","Type":"ContainerStarted","Data":"c9be3a4c5398a5023de5f74e5411417658795a945f4cde30906cdc317d61ebf5"} Dec 03 20:24:34.478257 master-0 kubenswrapper[29252]: I1203 20:24:34.478228 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-lf5ts" event={"ID":"7363059f-f7ee-4bb1-a028-f021e2e51f8e","Type":"ContainerStarted","Data":"ce146d46ce6feeb14b9c8c7822d6788e84d3585b0969cf95e31e9051a4246846"} Dec 03 20:24:34.857943 master-0 kubenswrapper[29252]: I1203 20:24:34.857799 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-jcxjt\" (UID: \"e0c29a23-11dd-445c-8ebf-cef7994d7bc3\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt" Dec 03 20:24:34.858149 master-0 kubenswrapper[29252]: E1203 20:24:34.858043 29252 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 20:24:34.858149 master-0 kubenswrapper[29252]: E1203 20:24:34.858135 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert podName:e0c29a23-11dd-445c-8ebf-cef7994d7bc3 nodeName:}" failed. 
No retries permitted until 2025-12-03 20:24:50.858107611 +0000 UTC m=+925.671652584 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-jcxjt" (UID: "e0c29a23-11dd-445c-8ebf-cef7994d7bc3") : secret "infra-operator-webhook-server-cert" not found Dec 03 20:24:35.368553 master-0 kubenswrapper[29252]: I1203 20:24:35.368447 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp\" (UID: \"b08561ad-441a-4ed6-b8d2-4af65531b047\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" Dec 03 20:24:35.369502 master-0 kubenswrapper[29252]: E1203 20:24:35.368673 29252 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 20:24:35.369502 master-0 kubenswrapper[29252]: E1203 20:24:35.368808 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert podName:b08561ad-441a-4ed6-b8d2-4af65531b047 nodeName:}" failed. No retries permitted until 2025-12-03 20:24:51.368752998 +0000 UTC m=+926.182297961 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert") pod "openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" (UID: "b08561ad-441a-4ed6-b8d2-4af65531b047") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 20:24:35.573170 master-0 kubenswrapper[29252]: I1203 20:24:35.573084 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:35.573397 master-0 kubenswrapper[29252]: I1203 20:24:35.573218 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:35.573397 master-0 kubenswrapper[29252]: E1203 20:24:35.573225 29252 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 20:24:35.573397 master-0 kubenswrapper[29252]: E1203 20:24:35.573266 29252 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 20:24:35.573529 master-0 kubenswrapper[29252]: E1203 20:24:35.573394 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs podName:0c987116-b442-4fd5-b528-bb2540c8c37c nodeName:}" failed. 
No retries permitted until 2025-12-03 20:24:51.573377685 +0000 UTC m=+926.386922638 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs") pod "openstack-operator-controller-manager-57d98476c4-g442r" (UID: "0c987116-b442-4fd5-b528-bb2540c8c37c") : secret "webhook-server-cert" not found Dec 03 20:24:35.573770 master-0 kubenswrapper[29252]: E1203 20:24:35.573737 29252 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs podName:0c987116-b442-4fd5-b528-bb2540c8c37c nodeName:}" failed. No retries permitted until 2025-12-03 20:24:51.573494238 +0000 UTC m=+926.387039191 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs") pod "openstack-operator-controller-manager-57d98476c4-g442r" (UID: "0c987116-b442-4fd5-b528-bb2540c8c37c") : secret "metrics-server-cert" not found Dec 03 20:24:42.573258 master-0 kubenswrapper[29252]: I1203 20:24:42.573206 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-5fcb6" event={"ID":"d3595782-c6a7-4f72-99fb-44f3a68f1f6d","Type":"ContainerStarted","Data":"3049c6d2e7b93cb32cbcb3441b05056e52ec3f3ba7f5f9493de7985af6dbcf25"} Dec 03 20:24:42.583090 master-0 kubenswrapper[29252]: I1203 20:24:42.582241 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-qw258" event={"ID":"12773d43-060c-4f0c-8a6c-615a6f577894","Type":"ContainerStarted","Data":"2528ae3f1418c4fc90960a43130a2c82ee2498e961af6201dcc4567b70f65be7"} Dec 03 20:24:42.598328 master-0 kubenswrapper[29252]: I1203 20:24:42.597626 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bhdr8" event={"ID":"7f965ddc-ff13-4f8e-b20c-aad918a7be33","Type":"ContainerStarted","Data":"8a829c2983a5d5052912e548064e7b6ccc74f82a8e1ad93c5a6bf826c85b5a78"} Dec 03 20:24:42.601992 master-0 kubenswrapper[29252]: I1203 20:24:42.601944 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-twshg" event={"ID":"d27a7a24-0257-494c-9cc6-889c8a971e81","Type":"ContainerStarted","Data":"62c3b94548a7e9692ea332826669254f537e3d7ead2a664fbf21fa7fb075217d"} Dec 03 20:24:42.611553 master-0 kubenswrapper[29252]: I1203 20:24:42.611485 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-kgmrh" event={"ID":"7d7bb0ae-4a5d-4196-a340-51fca6907f3a","Type":"ContainerStarted","Data":"a7f0abd3d3d8de851ccc72f48963f9f7f71d89fa48423d122f3dca5ed7636521"} Dec 03 20:24:42.615873 master-0 kubenswrapper[29252]: I1203 20:24:42.615832 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-7lgz9" event={"ID":"c2fed802-28a0-40d3-b422-581c334d8bc5","Type":"ContainerStarted","Data":"ff4a610d69969f7f8128cdce321cbfa8fdd94767a7767655ea62c378f4345c16"} Dec 03 20:24:42.617820 master-0 kubenswrapper[29252]: I1203 20:24:42.617791 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-kp2nj" event={"ID":"12a31e7e-9fbb-49e1-b779-6050f8898ce3","Type":"ContainerStarted","Data":"37b9b0fa103132350295e239893b094066d9780f6df750cb25a0ac027ea31170"} Dec 03 20:24:43.072890 master-0 kubenswrapper[29252]: E1203 20:24:43.066630 29252 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 
--upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-29gdg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6b9b669fdb-nbjnc_openstack-operators(f3ed1633-722c-4440-95db-2b644be51ba9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 20:24:43.072890 master-0 kubenswrapper[29252]: E1203 20:24:43.069262 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-nbjnc" podUID="f3ed1633-722c-4440-95db-2b644be51ba9" Dec 03 20:24:43.100499 master-0 kubenswrapper[29252]: E1203 20:24:43.087338 29252 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vzbs4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-5cd89994b5-78ft8_openstack-operators(b6ca362c-809a-47d1-8b68-9848967d382a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 20:24:43.100499 master-0 kubenswrapper[29252]: E1203 20:24:43.089736 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-78ft8" podUID="b6ca362c-809a-47d1-8b68-9848967d382a" Dec 03 20:24:43.219186 master-0 
kubenswrapper[29252]: E1203 20:24:43.207223 29252 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hxvvr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000810000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-696b999796-z7xsv_openstack-operators(90e0e11c-59df-46fd-9d7e-4c77a66cab18): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 20:24:43.219186 master-0 kubenswrapper[29252]: E1203 20:24:43.208403 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/swift-operator-controller-manager-696b999796-z7xsv" podUID="90e0e11c-59df-46fd-9d7e-4c77a66cab18" Dec 03 20:24:43.219186 master-0 kubenswrapper[29252]: E1203 20:24:43.217018 29252 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c552p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-57dfcdd5b8-tgvnm_openstack-operators(7b7362c6-9cc4-45dd-8a04-614481022860): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 20:24:43.219186 master-0 kubenswrapper[29252]: E1203 20:24:43.218224 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-tgvnm" podUID="7b7362c6-9cc4-45dd-8a04-614481022860" Dec 03 20:24:43.219461 master-0 kubenswrapper[29252]: E1203 20:24:43.219309 29252 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tqkfj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-647f96877-h76kk_openstack-operators(2fe48ec1-85b8-48dc-b0b5-8f3a5dc91dd0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 20:24:43.224057 master-0 kubenswrapper[29252]: E1203 
20:24:43.220477 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-647f96877-h76kk" podUID="2fe48ec1-85b8-48dc-b0b5-8f3a5dc91dd0" Dec 03 20:24:43.643798 master-0 kubenswrapper[29252]: I1203 20:24:43.640155 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-fsg7r" event={"ID":"4bfaaa2e-15b3-40fb-93c2-994c4a38559d","Type":"ContainerStarted","Data":"200a1ba6714baca4c4bec70ab17a6908bb81495828fb6f4b4e0139557a022453"} Dec 03 20:24:43.658828 master-0 kubenswrapper[29252]: I1203 20:24:43.653987 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-tgvnm" event={"ID":"7b7362c6-9cc4-45dd-8a04-614481022860","Type":"ContainerStarted","Data":"5b593abbfb88e662c3ec83651bcf5a747c5ca1169e65edb65dfdcf77b5a85e43"} Dec 03 20:24:43.658828 master-0 kubenswrapper[29252]: I1203 20:24:43.654304 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-tgvnm" Dec 03 20:24:43.660340 master-0 kubenswrapper[29252]: E1203 20:24:43.659551 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-tgvnm" podUID="7b7362c6-9cc4-45dd-8a04-614481022860" Dec 03 20:24:43.663794 master-0 kubenswrapper[29252]: I1203 20:24:43.661463 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-nlszs" 
event={"ID":"88b03a41-99d3-4be2-913d-6a6ce4ad4b78","Type":"ContainerStarted","Data":"5f6f6a47d77536f86b388f7869953dd16e17dae5b1acfe3cad66fe7eae09b43e"} Dec 03 20:24:43.681808 master-0 kubenswrapper[29252]: I1203 20:24:43.676975 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-vvm54" event={"ID":"eb78027d-6293-4a9a-961a-d4b57eb0e5f5","Type":"ContainerStarted","Data":"506f29f2cbc290f39d29891fe431184441aa02e158fa1579e6c6450f052b5d12"} Dec 03 20:24:43.701796 master-0 kubenswrapper[29252]: I1203 20:24:43.698754 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-pnq2w" event={"ID":"615341bc-bf59-4b24-9baa-3223edd30ad0","Type":"ContainerStarted","Data":"33e0733c780f5fdbee44875dffa2286e9846958286a8483878457c97fd58f9e3"} Dec 03 20:24:43.720810 master-0 kubenswrapper[29252]: I1203 20:24:43.713194 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-lf5ts" event={"ID":"7363059f-f7ee-4bb1-a028-f021e2e51f8e","Type":"ContainerStarted","Data":"fb79be52cba3d8df3f18cccf8fe40bd1502ca56651712dc272ee6c28006a1ac1"} Dec 03 20:24:43.742803 master-0 kubenswrapper[29252]: I1203 20:24:43.742532 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-t5dvt" event={"ID":"28196001-edc5-4152-830f-7712255d742c","Type":"ContainerStarted","Data":"9db62d464a4ef07f6657b900b83238ec143f4cbf42201e7dbb7adb11ed8bfbc3"} Dec 03 20:24:43.745795 master-0 kubenswrapper[29252]: I1203 20:24:43.745019 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-78ft8" event={"ID":"b6ca362c-809a-47d1-8b68-9848967d382a","Type":"ContainerStarted","Data":"6ebb87c6ae93e72aecd3ab49d99822d5b1d60a415081d13bb58b52df14fd9bcd"} Dec 03 20:24:43.746139 
master-0 kubenswrapper[29252]: I1203 20:24:43.746096 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-78ft8" Dec 03 20:24:43.751118 master-0 kubenswrapper[29252]: E1203 20:24:43.751056 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-78ft8" podUID="b6ca362c-809a-47d1-8b68-9848967d382a" Dec 03 20:24:43.753613 master-0 kubenswrapper[29252]: I1203 20:24:43.753566 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-q66vk" event={"ID":"64a91869-6d5d-4f4a-8f51-9eab613c4b13","Type":"ContainerStarted","Data":"1012d31eb1ec68ec6000bbae492aa8f9f800cac1a95c3b158294134e2239a047"} Dec 03 20:24:43.791801 master-0 kubenswrapper[29252]: I1203 20:24:43.791003 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-696b999796-z7xsv" event={"ID":"90e0e11c-59df-46fd-9d7e-4c77a66cab18","Type":"ContainerStarted","Data":"c09133d52d73d57c8350e25331478d630566b4bca75e3280aef944402fbd959a"} Dec 03 20:24:43.813813 master-0 kubenswrapper[29252]: I1203 20:24:43.792559 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-696b999796-z7xsv" Dec 03 20:24:43.813813 master-0 kubenswrapper[29252]: E1203 20:24:43.795799 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/swift-operator-controller-manager-696b999796-z7xsv" 
podUID="90e0e11c-59df-46fd-9d7e-4c77a66cab18" Dec 03 20:24:43.813813 master-0 kubenswrapper[29252]: I1203 20:24:43.811016 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-647f96877-h76kk" event={"ID":"2fe48ec1-85b8-48dc-b0b5-8f3a5dc91dd0","Type":"ContainerStarted","Data":"4400370a6e559b3a500aaa01ba705d8280e8ed3fc2ebcfc987a0cd94acb24a1a"} Dec 03 20:24:43.813813 master-0 kubenswrapper[29252]: I1203 20:24:43.813498 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-647f96877-h76kk" Dec 03 20:24:43.827796 master-0 kubenswrapper[29252]: E1203 20:24:43.815067 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-647f96877-h76kk" podUID="2fe48ec1-85b8-48dc-b0b5-8f3a5dc91dd0" Dec 03 20:24:43.827796 master-0 kubenswrapper[29252]: I1203 20:24:43.820152 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-nlszs" podStartSLOduration=16.771550645 podStartE2EDuration="24.820136874s" podCreationTimestamp="2025-12-03 20:24:19 +0000 UTC" firstStartedPulling="2025-12-03 20:24:34.190015438 +0000 UTC m=+909.003560411" lastFinishedPulling="2025-12-03 20:24:42.238601687 +0000 UTC m=+917.052146640" observedRunningTime="2025-12-03 20:24:43.75966368 +0000 UTC m=+918.573208633" watchObservedRunningTime="2025-12-03 20:24:43.820136874 +0000 UTC m=+918.633681817" Dec 03 20:24:43.847797 master-0 kubenswrapper[29252]: I1203 20:24:43.839907 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-nbjnc" 
event={"ID":"f3ed1633-722c-4440-95db-2b644be51ba9","Type":"ContainerStarted","Data":"9da4badbc9a5277de6f2b92df89c8b1520eede558500940d6cc66ee845dfc34f"} Dec 03 20:24:43.847797 master-0 kubenswrapper[29252]: I1203 20:24:43.839971 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-nbjnc" Dec 03 20:24:43.847797 master-0 kubenswrapper[29252]: E1203 20:24:43.843163 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-nbjnc" podUID="f3ed1633-722c-4440-95db-2b644be51ba9" Dec 03 20:24:43.847797 master-0 kubenswrapper[29252]: I1203 20:24:43.845225 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-v6spm" event={"ID":"4cf77700-7d9a-4d7e-bf0a-71777fa32e55","Type":"ContainerStarted","Data":"f39eef65f02f5e64bb08daa7f078f37e6b0f29c0c79de425cb6e6f3ccf474dcb"} Dec 03 20:24:44.862388 master-0 kubenswrapper[29252]: E1203 20:24:44.862054 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/swift-operator-controller-manager-696b999796-z7xsv" podUID="90e0e11c-59df-46fd-9d7e-4c77a66cab18" Dec 03 20:24:44.862945 master-0 kubenswrapper[29252]: E1203 20:24:44.862852 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-647f96877-h76kk" 
podUID="2fe48ec1-85b8-48dc-b0b5-8f3a5dc91dd0" Dec 03 20:24:44.863326 master-0 kubenswrapper[29252]: E1203 20:24:44.863303 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-78ft8" podUID="b6ca362c-809a-47d1-8b68-9848967d382a" Dec 03 20:24:44.863560 master-0 kubenswrapper[29252]: E1203 20:24:44.863517 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-nbjnc" podUID="f3ed1633-722c-4440-95db-2b644be51ba9" Dec 03 20:24:44.863612 master-0 kubenswrapper[29252]: E1203 20:24:44.863584 29252 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-tgvnm" podUID="7b7362c6-9cc4-45dd-8a04-614481022860" Dec 03 20:24:47.888507 master-0 kubenswrapper[29252]: I1203 20:24:47.888425 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-fsg7r" event={"ID":"4bfaaa2e-15b3-40fb-93c2-994c4a38559d","Type":"ContainerStarted","Data":"2eaa4da02b0f2ebeb0ce90122bd7fda039ed94ad43b8ad104c3c77a3c1160bf5"} Dec 03 20:24:47.889187 master-0 kubenswrapper[29252]: I1203 20:24:47.888521 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-fsg7r" Dec 03 20:24:47.893655 master-0 kubenswrapper[29252]: I1203 20:24:47.893608 
29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-7lgz9" event={"ID":"c2fed802-28a0-40d3-b422-581c334d8bc5","Type":"ContainerStarted","Data":"4d1b553563bca713e369839b01a26d5d5dabb04e1f38a61a29b4df90d8755b26"} Dec 03 20:24:47.893845 master-0 kubenswrapper[29252]: I1203 20:24:47.893794 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-7lgz9" Dec 03 20:24:47.897802 master-0 kubenswrapper[29252]: I1203 20:24:47.894499 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-fsg7r" Dec 03 20:24:47.897802 master-0 kubenswrapper[29252]: I1203 20:24:47.896019 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-v6spm" event={"ID":"4cf77700-7d9a-4d7e-bf0a-71777fa32e55","Type":"ContainerStarted","Data":"7a83babc78dc4c087df1621b32417fa5429cbf4c5050552e6898644d67bb6e21"} Dec 03 20:24:47.897802 master-0 kubenswrapper[29252]: I1203 20:24:47.896617 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-v6spm" Dec 03 20:24:47.897802 master-0 kubenswrapper[29252]: I1203 20:24:47.896665 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-7lgz9" Dec 03 20:24:47.898094 master-0 kubenswrapper[29252]: I1203 20:24:47.898070 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-v6spm" Dec 03 20:24:47.901801 master-0 kubenswrapper[29252]: I1203 20:24:47.898303 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-qw258" 
event={"ID":"12773d43-060c-4f0c-8a6c-615a6f577894","Type":"ContainerStarted","Data":"c1c5e953b9c5bfe162ca68ee332a7165a7054453285c1bb7c70f148d7a1dfb06"} Dec 03 20:24:47.901801 master-0 kubenswrapper[29252]: I1203 20:24:47.898510 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-qw258" Dec 03 20:24:47.901801 master-0 kubenswrapper[29252]: I1203 20:24:47.900181 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-qw258" Dec 03 20:24:47.905800 master-0 kubenswrapper[29252]: I1203 20:24:47.902394 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-pnq2w" event={"ID":"615341bc-bf59-4b24-9baa-3223edd30ad0","Type":"ContainerStarted","Data":"bd481f2df116543198ac9fd062ea49c9b2f82e4c3c0de4e269306c717de4a2f3"} Dec 03 20:24:47.905800 master-0 kubenswrapper[29252]: I1203 20:24:47.902696 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-pnq2w" Dec 03 20:24:47.905800 master-0 kubenswrapper[29252]: I1203 20:24:47.904466 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-pnq2w" Dec 03 20:24:48.349194 master-0 kubenswrapper[29252]: I1203 20:24:48.349123 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-fsg7r" podStartSLOduration=2.857560033 podStartE2EDuration="30.34910556s" podCreationTimestamp="2025-12-03 20:24:18 +0000 UTC" firstStartedPulling="2025-12-03 20:24:19.849009221 +0000 UTC m=+894.662554174" lastFinishedPulling="2025-12-03 20:24:47.340554748 +0000 UTC m=+922.154099701" observedRunningTime="2025-12-03 20:24:48.345930443 +0000 UTC 
m=+923.159475436" watchObservedRunningTime="2025-12-03 20:24:48.34910556 +0000 UTC m=+923.162650503" Dec 03 20:24:48.392403 master-0 kubenswrapper[29252]: I1203 20:24:48.391745 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-qw258" podStartSLOduration=4.398663754 podStartE2EDuration="30.391725529s" podCreationTimestamp="2025-12-03 20:24:18 +0000 UTC" firstStartedPulling="2025-12-03 20:24:21.17092726 +0000 UTC m=+895.984472213" lastFinishedPulling="2025-12-03 20:24:47.163989025 +0000 UTC m=+921.977533988" observedRunningTime="2025-12-03 20:24:48.380228139 +0000 UTC m=+923.193773112" watchObservedRunningTime="2025-12-03 20:24:48.391725529 +0000 UTC m=+923.205270482" Dec 03 20:24:48.432920 master-0 kubenswrapper[29252]: I1203 20:24:48.432740 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-7lgz9" podStartSLOduration=3.318537868 podStartE2EDuration="30.432721978s" podCreationTimestamp="2025-12-03 20:24:18 +0000 UTC" firstStartedPulling="2025-12-03 20:24:20.181686539 +0000 UTC m=+894.995231492" lastFinishedPulling="2025-12-03 20:24:47.295870639 +0000 UTC m=+922.109415602" observedRunningTime="2025-12-03 20:24:48.431800655 +0000 UTC m=+923.245345618" watchObservedRunningTime="2025-12-03 20:24:48.432721978 +0000 UTC m=+923.246266951" Dec 03 20:24:48.485628 master-0 kubenswrapper[29252]: I1203 20:24:48.484965 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-v6spm" podStartSLOduration=3.695959457 podStartE2EDuration="30.484940731s" podCreationTimestamp="2025-12-03 20:24:18 +0000 UTC" firstStartedPulling="2025-12-03 20:24:20.746655949 +0000 UTC m=+895.560200912" lastFinishedPulling="2025-12-03 20:24:47.535637243 +0000 UTC m=+922.349182186" observedRunningTime="2025-12-03 20:24:48.466124872 
+0000 UTC m=+923.279669835" watchObservedRunningTime="2025-12-03 20:24:48.484940731 +0000 UTC m=+923.298485694" Dec 03 20:24:48.507154 master-0 kubenswrapper[29252]: I1203 20:24:48.506936 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-pnq2w" podStartSLOduration=4.011486857 podStartE2EDuration="30.506865445s" podCreationTimestamp="2025-12-03 20:24:18 +0000 UTC" firstStartedPulling="2025-12-03 20:24:20.733316694 +0000 UTC m=+895.546861647" lastFinishedPulling="2025-12-03 20:24:47.228695272 +0000 UTC m=+922.042240235" observedRunningTime="2025-12-03 20:24:48.498284796 +0000 UTC m=+923.311829749" watchObservedRunningTime="2025-12-03 20:24:48.506865445 +0000 UTC m=+923.320410418" Dec 03 20:24:48.921536 master-0 kubenswrapper[29252]: I1203 20:24:48.921079 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-t5dvt" event={"ID":"28196001-edc5-4152-830f-7712255d742c","Type":"ContainerStarted","Data":"6d881e980da47ec9f2d48b28392958ebe3cf47855d70a9eca3754cd9dbeb7055"} Dec 03 20:24:48.921536 master-0 kubenswrapper[29252]: I1203 20:24:48.921497 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-t5dvt" Dec 03 20:24:48.924678 master-0 kubenswrapper[29252]: I1203 20:24:48.924553 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-t5dvt" Dec 03 20:24:48.939826 master-0 kubenswrapper[29252]: I1203 20:24:48.939750 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-twshg" event={"ID":"d27a7a24-0257-494c-9cc6-889c8a971e81","Type":"ContainerStarted","Data":"fd6dacd7fe452a284cbe6ff2f13d0ec6861948a511e6119625a28afb53a0fca0"} Dec 03 20:24:48.940243 master-0 
kubenswrapper[29252]: I1203 20:24:48.940224 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-twshg" Dec 03 20:24:48.943629 master-0 kubenswrapper[29252]: I1203 20:24:48.943578 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-twshg" Dec 03 20:24:48.947519 master-0 kubenswrapper[29252]: I1203 20:24:48.947489 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-kgmrh" event={"ID":"7d7bb0ae-4a5d-4196-a340-51fca6907f3a","Type":"ContainerStarted","Data":"f53a5b094ba34f0b1048d25f2970d7543bdc062e7706b7de86546ff7a86dfbb7"} Dec 03 20:24:48.948096 master-0 kubenswrapper[29252]: I1203 20:24:48.948079 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-kgmrh" Dec 03 20:24:48.952997 master-0 kubenswrapper[29252]: I1203 20:24:48.952963 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-kgmrh" Dec 03 20:24:48.958895 master-0 kubenswrapper[29252]: I1203 20:24:48.958806 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-t5dvt" podStartSLOduration=3.267749441 podStartE2EDuration="30.95876314s" podCreationTimestamp="2025-12-03 20:24:18 +0000 UTC" firstStartedPulling="2025-12-03 20:24:20.754402658 +0000 UTC m=+895.567947611" lastFinishedPulling="2025-12-03 20:24:48.445416357 +0000 UTC m=+923.258961310" observedRunningTime="2025-12-03 20:24:48.948345755 +0000 UTC m=+923.761890718" watchObservedRunningTime="2025-12-03 20:24:48.95876314 +0000 UTC m=+923.772308123" Dec 03 20:24:49.016602 master-0 kubenswrapper[29252]: I1203 20:24:49.016494 29252 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-78ft8" Dec 03 20:24:49.052927 master-0 kubenswrapper[29252]: I1203 20:24:49.049267 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-kgmrh" podStartSLOduration=4.010478472 podStartE2EDuration="31.049219424s" podCreationTimestamp="2025-12-03 20:24:18 +0000 UTC" firstStartedPulling="2025-12-03 20:24:20.768280396 +0000 UTC m=+895.581825349" lastFinishedPulling="2025-12-03 20:24:47.807021338 +0000 UTC m=+922.620566301" observedRunningTime="2025-12-03 20:24:49.029337929 +0000 UTC m=+923.842882882" watchObservedRunningTime="2025-12-03 20:24:49.049219424 +0000 UTC m=+923.862764397" Dec 03 20:24:49.080418 master-0 kubenswrapper[29252]: I1203 20:24:49.080334 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-twshg" podStartSLOduration=2.6474776650000003 podStartE2EDuration="30.080308232s" podCreationTimestamp="2025-12-03 20:24:19 +0000 UTC" firstStartedPulling="2025-12-03 20:24:21.172203131 +0000 UTC m=+895.985748084" lastFinishedPulling="2025-12-03 20:24:48.605033708 +0000 UTC m=+923.418578651" observedRunningTime="2025-12-03 20:24:49.064968058 +0000 UTC m=+923.878513011" watchObservedRunningTime="2025-12-03 20:24:49.080308232 +0000 UTC m=+923.893853185" Dec 03 20:24:49.955467 master-0 kubenswrapper[29252]: I1203 20:24:49.955391 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-lf5ts" event={"ID":"7363059f-f7ee-4bb1-a028-f021e2e51f8e","Type":"ContainerStarted","Data":"29b7fea9db7d03817a1b702ac2b72a0813d919aea1d9c3ef7b8f7b0022e705d1"} Dec 03 20:24:49.956089 master-0 kubenswrapper[29252]: I1203 20:24:49.955551 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-lf5ts" Dec 03 20:24:49.958020 master-0 kubenswrapper[29252]: I1203 20:24:49.957497 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bhdr8" event={"ID":"7f965ddc-ff13-4f8e-b20c-aad918a7be33","Type":"ContainerStarted","Data":"8b62ea4021d2e65bdac648555d706d5c4632700fca45940dec3713591cca1b9d"} Dec 03 20:24:49.958020 master-0 kubenswrapper[29252]: I1203 20:24:49.957671 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bhdr8" Dec 03 20:24:49.958020 master-0 kubenswrapper[29252]: I1203 20:24:49.957712 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-lf5ts" Dec 03 20:24:49.959395 master-0 kubenswrapper[29252]: I1203 20:24:49.959259 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bhdr8" Dec 03 20:24:49.959710 master-0 kubenswrapper[29252]: I1203 20:24:49.959660 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-78ft8" event={"ID":"b6ca362c-809a-47d1-8b68-9848967d382a","Type":"ContainerStarted","Data":"54db2e3af61de2ec9f0b86bae4410dba3c57dccf3f42911d8c95437b1e0bebe8"} Dec 03 20:24:49.961687 master-0 kubenswrapper[29252]: I1203 20:24:49.961636 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-q66vk" event={"ID":"64a91869-6d5d-4f4a-8f51-9eab613c4b13","Type":"ContainerStarted","Data":"466b6dbce3a0264dedad16a8498ecd3ad7957528939c3d820e89fdef2705cf1d"} Dec 03 20:24:49.961828 master-0 kubenswrapper[29252]: I1203 20:24:49.961800 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-q66vk" Dec 03 20:24:49.963900 master-0 kubenswrapper[29252]: I1203 20:24:49.963864 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-q66vk" Dec 03 20:24:49.964301 master-0 kubenswrapper[29252]: I1203 20:24:49.964266 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-kp2nj" event={"ID":"12a31e7e-9fbb-49e1-b779-6050f8898ce3","Type":"ContainerStarted","Data":"4c4b28a482238ac0b3b3aad60b8d6d3f9eca2f89fb5becdd676e534d2f9c6340"} Dec 03 20:24:49.964555 master-0 kubenswrapper[29252]: I1203 20:24:49.964510 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-kp2nj" Dec 03 20:24:49.966123 master-0 kubenswrapper[29252]: I1203 20:24:49.966100 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-kp2nj" Dec 03 20:24:49.967068 master-0 kubenswrapper[29252]: I1203 20:24:49.967022 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-vvm54" event={"ID":"eb78027d-6293-4a9a-961a-d4b57eb0e5f5","Type":"ContainerStarted","Data":"eb51770e0ac34a86d9f597ac347a12a404603f0f953b64a81dc1d2f65afaa7b2"} Dec 03 20:24:49.967287 master-0 kubenswrapper[29252]: I1203 20:24:49.967251 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-vvm54" Dec 03 20:24:49.969092 master-0 kubenswrapper[29252]: I1203 20:24:49.969036 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-vvm54" Dec 03 20:24:49.969552 master-0 
kubenswrapper[29252]: I1203 20:24:49.969486 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-5fcb6" event={"ID":"d3595782-c6a7-4f72-99fb-44f3a68f1f6d","Type":"ContainerStarted","Data":"829b0711d67fe057928dc79532765e960048911699b77e4217c55558cb2654a1"} Dec 03 20:24:49.970538 master-0 kubenswrapper[29252]: I1203 20:24:49.970459 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-5fcb6" Dec 03 20:24:49.972528 master-0 kubenswrapper[29252]: I1203 20:24:49.972496 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-5fcb6" Dec 03 20:24:49.973674 master-0 kubenswrapper[29252]: I1203 20:24:49.973602 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-647f96877-h76kk" Dec 03 20:24:50.007401 master-0 kubenswrapper[29252]: I1203 20:24:50.007305 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-lf5ts" podStartSLOduration=16.49688314 podStartE2EDuration="31.007283736s" podCreationTimestamp="2025-12-03 20:24:19 +0000 UTC" firstStartedPulling="2025-12-03 20:24:34.165675814 +0000 UTC m=+908.979220787" lastFinishedPulling="2025-12-03 20:24:48.67607643 +0000 UTC m=+923.489621383" observedRunningTime="2025-12-03 20:24:49.995685152 +0000 UTC m=+924.809230125" watchObservedRunningTime="2025-12-03 20:24:50.007283736 +0000 UTC m=+924.820828709" Dec 03 20:24:50.017531 master-0 kubenswrapper[29252]: I1203 20:24:50.017479 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-696b999796-z7xsv" Dec 03 20:24:50.067221 master-0 kubenswrapper[29252]: I1203 20:24:50.067156 29252 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-tgvnm" Dec 03 20:24:50.070997 master-0 kubenswrapper[29252]: I1203 20:24:50.070893 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bhdr8" podStartSLOduration=3.617036523 podStartE2EDuration="32.070876235s" podCreationTimestamp="2025-12-03 20:24:18 +0000 UTC" firstStartedPulling="2025-12-03 20:24:20.448038531 +0000 UTC m=+895.261583484" lastFinishedPulling="2025-12-03 20:24:48.901878243 +0000 UTC m=+923.715423196" observedRunningTime="2025-12-03 20:24:50.060054991 +0000 UTC m=+924.873599944" watchObservedRunningTime="2025-12-03 20:24:50.070876235 +0000 UTC m=+924.884421188" Dec 03 20:24:50.091886 master-0 kubenswrapper[29252]: I1203 20:24:50.084717 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-nbjnc" Dec 03 20:24:50.124624 master-0 kubenswrapper[29252]: I1203 20:24:50.124493 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-5fcb6" podStartSLOduration=4.201478037 podStartE2EDuration="32.124468121s" podCreationTimestamp="2025-12-03 20:24:18 +0000 UTC" firstStartedPulling="2025-12-03 20:24:21.173982904 +0000 UTC m=+895.987527847" lastFinishedPulling="2025-12-03 20:24:49.096972978 +0000 UTC m=+923.910517931" observedRunningTime="2025-12-03 20:24:50.08295414 +0000 UTC m=+924.896499103" watchObservedRunningTime="2025-12-03 20:24:50.124468121 +0000 UTC m=+924.938013094" Dec 03 20:24:50.138863 master-0 kubenswrapper[29252]: I1203 20:24:50.126941 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-q66vk" podStartSLOduration=4.484119978 
podStartE2EDuration="32.126926971s" podCreationTimestamp="2025-12-03 20:24:18 +0000 UTC" firstStartedPulling="2025-12-03 20:24:21.17417184 +0000 UTC m=+895.987716793" lastFinishedPulling="2025-12-03 20:24:48.816978833 +0000 UTC m=+923.630523786" observedRunningTime="2025-12-03 20:24:50.102939857 +0000 UTC m=+924.916484840" watchObservedRunningTime="2025-12-03 20:24:50.126926971 +0000 UTC m=+924.940471924" Dec 03 20:24:50.197811 master-0 kubenswrapper[29252]: I1203 20:24:50.195408 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-kp2nj" podStartSLOduration=3.21266981 podStartE2EDuration="31.19538214s" podCreationTimestamp="2025-12-03 20:24:19 +0000 UTC" firstStartedPulling="2025-12-03 20:24:21.170724885 +0000 UTC m=+895.984269828" lastFinishedPulling="2025-12-03 20:24:49.153437195 +0000 UTC m=+923.966982158" observedRunningTime="2025-12-03 20:24:50.154253848 +0000 UTC m=+924.967798811" watchObservedRunningTime="2025-12-03 20:24:50.19538214 +0000 UTC m=+925.008927113" Dec 03 20:24:50.238615 master-0 kubenswrapper[29252]: I1203 20:24:50.238532 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-vvm54" podStartSLOduration=15.424311448 podStartE2EDuration="31.238511471s" podCreationTimestamp="2025-12-03 20:24:19 +0000 UTC" firstStartedPulling="2025-12-03 20:24:32.934063826 +0000 UTC m=+907.747608819" lastFinishedPulling="2025-12-03 20:24:48.748263889 +0000 UTC m=+923.561808842" observedRunningTime="2025-12-03 20:24:50.184954466 +0000 UTC m=+924.998499439" watchObservedRunningTime="2025-12-03 20:24:50.238511471 +0000 UTC m=+925.052056424" Dec 03 20:24:50.286162 master-0 kubenswrapper[29252]: I1203 20:24:50.286076 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-78ft8" 
podStartSLOduration=14.350928184 podStartE2EDuration="32.28604979s" podCreationTimestamp="2025-12-03 20:24:18 +0000 UTC" firstStartedPulling="2025-12-03 20:24:20.158029293 +0000 UTC m=+894.971574246" lastFinishedPulling="2025-12-03 20:24:38.093150899 +0000 UTC m=+912.906695852" observedRunningTime="2025-12-03 20:24:50.222348187 +0000 UTC m=+925.035893140" watchObservedRunningTime="2025-12-03 20:24:50.28604979 +0000 UTC m=+925.099594753" Dec 03 20:24:50.904155 master-0 kubenswrapper[29252]: I1203 20:24:50.904112 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-jcxjt\" (UID: \"e0c29a23-11dd-445c-8ebf-cef7994d7bc3\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt" Dec 03 20:24:50.908045 master-0 kubenswrapper[29252]: I1203 20:24:50.907997 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0c29a23-11dd-445c-8ebf-cef7994d7bc3-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-jcxjt\" (UID: \"e0c29a23-11dd-445c-8ebf-cef7994d7bc3\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt" Dec 03 20:24:50.980795 master-0 kubenswrapper[29252]: I1203 20:24:50.980672 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-696b999796-z7xsv" event={"ID":"90e0e11c-59df-46fd-9d7e-4c77a66cab18","Type":"ContainerStarted","Data":"15146f2bc0b0b28c31526c04c4ea090f3d6ae6e7b635ebf947b6668754d9d7df"} Dec 03 20:24:50.985548 master-0 kubenswrapper[29252]: I1203 20:24:50.985481 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-647f96877-h76kk" 
event={"ID":"2fe48ec1-85b8-48dc-b0b5-8f3a5dc91dd0","Type":"ContainerStarted","Data":"2445582112ac6aebf4bf890208b7563fc45cc7c0cf953838f714870f82e9983c"} Dec 03 20:24:50.988484 master-0 kubenswrapper[29252]: I1203 20:24:50.988454 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-nbjnc" event={"ID":"f3ed1633-722c-4440-95db-2b644be51ba9","Type":"ContainerStarted","Data":"25738caccef82b8d6eb6c6e1425cce036c64e7512a72c62ad46d8bb45bd6b4ce"} Dec 03 20:24:50.991803 master-0 kubenswrapper[29252]: I1203 20:24:50.991727 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-tgvnm" event={"ID":"7b7362c6-9cc4-45dd-8a04-614481022860","Type":"ContainerStarted","Data":"ba9b88a0e63c7a86fe93c5c18aae6039daca0832050391462eb32612a7ebb211"} Dec 03 20:24:51.010564 master-0 kubenswrapper[29252]: I1203 20:24:51.010463 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-696b999796-z7xsv" podStartSLOduration=12.930880175 podStartE2EDuration="32.010441576s" podCreationTimestamp="2025-12-03 20:24:19 +0000 UTC" firstStartedPulling="2025-12-03 20:24:22.996812322 +0000 UTC m=+897.810357275" lastFinishedPulling="2025-12-03 20:24:42.076373703 +0000 UTC m=+916.889918676" observedRunningTime="2025-12-03 20:24:50.997423669 +0000 UTC m=+925.810968642" watchObservedRunningTime="2025-12-03 20:24:51.010441576 +0000 UTC m=+925.823986529" Dec 03 20:24:51.035340 master-0 kubenswrapper[29252]: I1203 20:24:51.035002 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-tgvnm" podStartSLOduration=22.984079445 podStartE2EDuration="32.034978634s" podCreationTimestamp="2025-12-03 20:24:19 +0000 UTC" firstStartedPulling="2025-12-03 20:24:32.930416987 +0000 UTC m=+907.743961970" 
lastFinishedPulling="2025-12-03 20:24:41.981316196 +0000 UTC m=+916.794861159" observedRunningTime="2025-12-03 20:24:51.027697586 +0000 UTC m=+925.841242549" watchObservedRunningTime="2025-12-03 20:24:51.034978634 +0000 UTC m=+925.848523597" Dec 03 20:24:51.048411 master-0 kubenswrapper[29252]: I1203 20:24:51.048364 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt" Dec 03 20:24:51.056898 master-0 kubenswrapper[29252]: I1203 20:24:51.056750 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-647f96877-h76kk" podStartSLOduration=24.265652921 podStartE2EDuration="32.056721534s" podCreationTimestamp="2025-12-03 20:24:19 +0000 UTC" firstStartedPulling="2025-12-03 20:24:34.190273884 +0000 UTC m=+909.003818847" lastFinishedPulling="2025-12-03 20:24:41.981342507 +0000 UTC m=+916.794887460" observedRunningTime="2025-12-03 20:24:51.056274993 +0000 UTC m=+925.869819946" watchObservedRunningTime="2025-12-03 20:24:51.056721534 +0000 UTC m=+925.870266487" Dec 03 20:24:51.079847 master-0 kubenswrapper[29252]: I1203 20:24:51.079510 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-nbjnc" podStartSLOduration=24.293472468 podStartE2EDuration="32.079492248s" podCreationTimestamp="2025-12-03 20:24:19 +0000 UTC" firstStartedPulling="2025-12-03 20:24:34.165615053 +0000 UTC m=+908.979160006" lastFinishedPulling="2025-12-03 20:24:41.951634833 +0000 UTC m=+916.765179786" observedRunningTime="2025-12-03 20:24:51.078685519 +0000 UTC m=+925.892230472" watchObservedRunningTime="2025-12-03 20:24:51.079492248 +0000 UTC m=+925.893037211" Dec 03 20:24:51.413425 master-0 kubenswrapper[29252]: I1203 20:24:51.413241 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp\" (UID: \"b08561ad-441a-4ed6-b8d2-4af65531b047\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" Dec 03 20:24:51.417132 master-0 kubenswrapper[29252]: I1203 20:24:51.417045 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b08561ad-441a-4ed6-b8d2-4af65531b047-cert\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp\" (UID: \"b08561ad-441a-4ed6-b8d2-4af65531b047\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" Dec 03 20:24:51.475442 master-0 kubenswrapper[29252]: I1203 20:24:51.475371 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" Dec 03 20:24:51.624314 master-0 kubenswrapper[29252]: I1203 20:24:51.624227 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:51.624695 master-0 kubenswrapper[29252]: I1203 20:24:51.624530 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" Dec 03 20:24:51.628213 master-0 kubenswrapper[29252]: I1203 20:24:51.628163 29252 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-webhook-certs\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r"
Dec 03 20:24:51.628686 master-0 kubenswrapper[29252]: I1203 20:24:51.628645 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c987116-b442-4fd5-b528-bb2540c8c37c-metrics-certs\") pod \"openstack-operator-controller-manager-57d98476c4-g442r\" (UID: \"0c987116-b442-4fd5-b528-bb2540c8c37c\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r"
Dec 03 20:24:51.820005 master-0 kubenswrapper[29252]: I1203 20:24:51.808036 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt"]
Dec 03 20:24:51.902484 master-0 kubenswrapper[29252]: I1203 20:24:51.902421 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r"
Dec 03 20:24:51.940385 master-0 kubenswrapper[29252]: W1203 20:24:51.940323 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb08561ad_441a_4ed6_b8d2_4af65531b047.slice/crio-8498f77ed1f3160a3b6808bd634b8db69612b1c89147dff0ece79dee39197c69 WatchSource:0}: Error finding container 8498f77ed1f3160a3b6808bd634b8db69612b1c89147dff0ece79dee39197c69: Status 404 returned error can't find the container with id 8498f77ed1f3160a3b6808bd634b8db69612b1c89147dff0ece79dee39197c69
Dec 03 20:24:51.941773 master-0 kubenswrapper[29252]: I1203 20:24:51.941730 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp"]
Dec 03 20:24:52.009983 master-0 kubenswrapper[29252]: I1203 20:24:52.009924 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" event={"ID":"b08561ad-441a-4ed6-b8d2-4af65531b047","Type":"ContainerStarted","Data":"8498f77ed1f3160a3b6808bd634b8db69612b1c89147dff0ece79dee39197c69"}
Dec 03 20:24:52.011102 master-0 kubenswrapper[29252]: I1203 20:24:52.011042 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt" event={"ID":"e0c29a23-11dd-445c-8ebf-cef7994d7bc3","Type":"ContainerStarted","Data":"daff555bdf827f516149e01bd56aa54c432a8e83a4930d0c4dc996244dfc55a8"}
Dec 03 20:24:52.332446 master-0 kubenswrapper[29252]: W1203 20:24:52.332368 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c987116_b442_4fd5_b528_bb2540c8c37c.slice/crio-621f35bf39a3b41634faad0b675ee59f3fb911e8bb99f095779ea8da4d43fe01 WatchSource:0}: Error finding container 621f35bf39a3b41634faad0b675ee59f3fb911e8bb99f095779ea8da4d43fe01: Status 404 returned error can't find the container with id 621f35bf39a3b41634faad0b675ee59f3fb911e8bb99f095779ea8da4d43fe01
Dec 03 20:24:52.334670 master-0 kubenswrapper[29252]: I1203 20:24:52.334599 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r"]
Dec 03 20:24:53.027469 master-0 kubenswrapper[29252]: I1203 20:24:53.027418 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" event={"ID":"0c987116-b442-4fd5-b528-bb2540c8c37c","Type":"ContainerStarted","Data":"191c2f8f4fb1a31b352fc47dc76a29149f23b4f02cb8f6e8f1f2a999fa4c6c94"}
Dec 03 20:24:53.027469 master-0 kubenswrapper[29252]: I1203 20:24:53.027470 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" event={"ID":"0c987116-b442-4fd5-b528-bb2540c8c37c","Type":"ContainerStarted","Data":"621f35bf39a3b41634faad0b675ee59f3fb911e8bb99f095779ea8da4d43fe01"}
Dec 03 20:24:53.028973 master-0 kubenswrapper[29252]: I1203 20:24:53.028877 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r"
Dec 03 20:24:53.094473 master-0 kubenswrapper[29252]: I1203 20:24:53.090118 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r" podStartSLOduration=34.090080113 podStartE2EDuration="34.090080113s" podCreationTimestamp="2025-12-03 20:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:24:53.078529102 +0000 UTC m=+927.892074075" watchObservedRunningTime="2025-12-03 20:24:53.090080113 +0000 UTC m=+927.903625086"
Dec 03 20:24:55.049508 master-0 kubenswrapper[29252]: I1203 20:24:55.049354 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt" event={"ID":"e0c29a23-11dd-445c-8ebf-cef7994d7bc3","Type":"ContainerStarted","Data":"a89a53b327f154c9dbccda8b356ba0effa617b1364d972d608065ed5f4343332"}
Dec 03 20:24:55.049508 master-0 kubenswrapper[29252]: I1203 20:24:55.049429 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt" event={"ID":"e0c29a23-11dd-445c-8ebf-cef7994d7bc3","Type":"ContainerStarted","Data":"685f826051e03451b410ade44a862001fe9c86e19f6f4954b9dbe32892a406df"}
Dec 03 20:24:55.049508 master-0 kubenswrapper[29252]: I1203 20:24:55.049469 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt"
Dec 03 20:24:55.052030 master-0 kubenswrapper[29252]: I1203 20:24:55.051993 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" event={"ID":"b08561ad-441a-4ed6-b8d2-4af65531b047","Type":"ContainerStarted","Data":"c0811b82096970713b5094e4832e15b9083484ae589186af575497406b9c7386"}
Dec 03 20:24:55.052125 master-0 kubenswrapper[29252]: I1203 20:24:55.052033 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" event={"ID":"b08561ad-441a-4ed6-b8d2-4af65531b047","Type":"ContainerStarted","Data":"56971ebd34d4e91830f58381043479eed5f0ae9c7f158cb6225dc5d00154e51f"}
Dec 03 20:24:55.052255 master-0 kubenswrapper[29252]: I1203 20:24:55.052204 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp"
Dec 03 20:24:55.090066 master-0 kubenswrapper[29252]: I1203 20:24:55.089982 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt" podStartSLOduration=34.424582973 podStartE2EDuration="37.089963857s" podCreationTimestamp="2025-12-03 20:24:18 +0000 UTC" firstStartedPulling="2025-12-03 20:24:51.825531092 +0000 UTC m=+926.639076045" lastFinishedPulling="2025-12-03 20:24:54.490911976 +0000 UTC m=+929.304456929" observedRunningTime="2025-12-03 20:24:55.087019106 +0000 UTC m=+929.900564079" watchObservedRunningTime="2025-12-03 20:24:55.089963857 +0000 UTC m=+929.903508810"
Dec 03 20:24:55.130304 master-0 kubenswrapper[29252]: I1203 20:24:55.130156 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp" podStartSLOduration=33.554379026 podStartE2EDuration="36.130125465s" podCreationTimestamp="2025-12-03 20:24:19 +0000 UTC" firstStartedPulling="2025-12-03 20:24:51.942646597 +0000 UTC m=+926.756191550" lastFinishedPulling="2025-12-03 20:24:54.518393026 +0000 UTC m=+929.331937989" observedRunningTime="2025-12-03 20:24:55.116548135 +0000 UTC m=+929.930093108" watchObservedRunningTime="2025-12-03 20:24:55.130125465 +0000 UTC m=+929.943670418"
Dec 03 20:25:01.056756 master-0 kubenswrapper[29252]: I1203 20:25:01.056687 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-jcxjt"
Dec 03 20:25:01.496889 master-0 kubenswrapper[29252]: I1203 20:25:01.489204 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp"
Dec 03 20:25:01.909007 master-0 kubenswrapper[29252]: I1203 20:25:01.908943 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-g442r"
Dec 03 20:29:23.311049 master-0 kubenswrapper[29252]: I1203 20:29:23.310938 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wl4bp"]
Dec 03 20:29:23.314751 master-0 kubenswrapper[29252]: I1203 20:29:23.314704 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl4bp"
Dec 03 20:29:23.337401 master-0 kubenswrapper[29252]: I1203 20:29:23.337324 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl4bp"]
Dec 03 20:29:23.502637 master-0 kubenswrapper[29252]: I1203 20:29:23.502554 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440f7b6a-2225-45ae-8159-478cc4ee8b81-utilities\") pod \"redhat-marketplace-wl4bp\" (UID: \"440f7b6a-2225-45ae-8159-478cc4ee8b81\") " pod="openshift-marketplace/redhat-marketplace-wl4bp"
Dec 03 20:29:23.502912 master-0 kubenswrapper[29252]: I1203 20:29:23.502644 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbbwm\" (UniqueName: \"kubernetes.io/projected/440f7b6a-2225-45ae-8159-478cc4ee8b81-kube-api-access-rbbwm\") pod \"redhat-marketplace-wl4bp\" (UID: \"440f7b6a-2225-45ae-8159-478cc4ee8b81\") " pod="openshift-marketplace/redhat-marketplace-wl4bp"
Dec 03 20:29:23.502912 master-0 kubenswrapper[29252]: I1203 20:29:23.502723 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440f7b6a-2225-45ae-8159-478cc4ee8b81-catalog-content\") pod \"redhat-marketplace-wl4bp\" (UID: \"440f7b6a-2225-45ae-8159-478cc4ee8b81\") " pod="openshift-marketplace/redhat-marketplace-wl4bp"
Dec 03 20:29:23.603656 master-0 kubenswrapper[29252]: I1203 20:29:23.603522 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440f7b6a-2225-45ae-8159-478cc4ee8b81-utilities\") pod \"redhat-marketplace-wl4bp\" (UID: \"440f7b6a-2225-45ae-8159-478cc4ee8b81\") " pod="openshift-marketplace/redhat-marketplace-wl4bp"
Dec 03 20:29:23.603656 master-0 kubenswrapper[29252]: I1203 20:29:23.603581 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbbwm\" (UniqueName: \"kubernetes.io/projected/440f7b6a-2225-45ae-8159-478cc4ee8b81-kube-api-access-rbbwm\") pod \"redhat-marketplace-wl4bp\" (UID: \"440f7b6a-2225-45ae-8159-478cc4ee8b81\") " pod="openshift-marketplace/redhat-marketplace-wl4bp"
Dec 03 20:29:23.603656 master-0 kubenswrapper[29252]: I1203 20:29:23.603619 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440f7b6a-2225-45ae-8159-478cc4ee8b81-catalog-content\") pod \"redhat-marketplace-wl4bp\" (UID: \"440f7b6a-2225-45ae-8159-478cc4ee8b81\") " pod="openshift-marketplace/redhat-marketplace-wl4bp"
Dec 03 20:29:23.604108 master-0 kubenswrapper[29252]: I1203 20:29:23.604074 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440f7b6a-2225-45ae-8159-478cc4ee8b81-utilities\") pod \"redhat-marketplace-wl4bp\" (UID: \"440f7b6a-2225-45ae-8159-478cc4ee8b81\") " pod="openshift-marketplace/redhat-marketplace-wl4bp"
Dec 03 20:29:23.604191 master-0 kubenswrapper[29252]: I1203 20:29:23.604165 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440f7b6a-2225-45ae-8159-478cc4ee8b81-catalog-content\") pod \"redhat-marketplace-wl4bp\" (UID: \"440f7b6a-2225-45ae-8159-478cc4ee8b81\") " pod="openshift-marketplace/redhat-marketplace-wl4bp"
Dec 03 20:29:23.623079 master-0 kubenswrapper[29252]: I1203 20:29:23.623027 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbbwm\" (UniqueName: \"kubernetes.io/projected/440f7b6a-2225-45ae-8159-478cc4ee8b81-kube-api-access-rbbwm\") pod \"redhat-marketplace-wl4bp\" (UID: \"440f7b6a-2225-45ae-8159-478cc4ee8b81\") " pod="openshift-marketplace/redhat-marketplace-wl4bp"
Dec 03 20:29:23.649907 master-0 kubenswrapper[29252]: I1203 20:29:23.649853 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl4bp"
Dec 03 20:29:23.929802 master-0 kubenswrapper[29252]: I1203 20:29:23.929717 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl4bp"]
Dec 03 20:29:23.937691 master-0 kubenswrapper[29252]: W1203 20:29:23.937622 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod440f7b6a_2225_45ae_8159_478cc4ee8b81.slice/crio-442f525287c39d80d7fcecf1437a9476886883e1c8580122e456bda0164d02a7 WatchSource:0}: Error finding container 442f525287c39d80d7fcecf1437a9476886883e1c8580122e456bda0164d02a7: Status 404 returned error can't find the container with id 442f525287c39d80d7fcecf1437a9476886883e1c8580122e456bda0164d02a7
Dec 03 20:29:24.362896 master-0 kubenswrapper[29252]: I1203 20:29:24.362817 29252 generic.go:334] "Generic (PLEG): container finished" podID="440f7b6a-2225-45ae-8159-478cc4ee8b81" containerID="6904ee7263e52ca553f3a569a4f813941dd623a3c84cea71a5c8fe08171650a3" exitCode=0
Dec 03 20:29:24.363409 master-0 kubenswrapper[29252]: I1203 20:29:24.362908 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl4bp" event={"ID":"440f7b6a-2225-45ae-8159-478cc4ee8b81","Type":"ContainerDied","Data":"6904ee7263e52ca553f3a569a4f813941dd623a3c84cea71a5c8fe08171650a3"}
Dec 03 20:29:24.363409 master-0 kubenswrapper[29252]: I1203 20:29:24.362955 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl4bp" event={"ID":"440f7b6a-2225-45ae-8159-478cc4ee8b81","Type":"ContainerStarted","Data":"442f525287c39d80d7fcecf1437a9476886883e1c8580122e456bda0164d02a7"}
Dec 03 20:29:25.377755 master-0 kubenswrapper[29252]: I1203 20:29:25.377672 29252 generic.go:334] "Generic (PLEG): container finished" podID="440f7b6a-2225-45ae-8159-478cc4ee8b81" containerID="de95ad097998e182e2ceca50f5e790a86709b6d7100cf21b1d2cb8ff49823766" exitCode=0
Dec 03 20:29:25.378549 master-0 kubenswrapper[29252]: I1203 20:29:25.377767 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl4bp" event={"ID":"440f7b6a-2225-45ae-8159-478cc4ee8b81","Type":"ContainerDied","Data":"de95ad097998e182e2ceca50f5e790a86709b6d7100cf21b1d2cb8ff49823766"}
Dec 03 20:29:26.394951 master-0 kubenswrapper[29252]: I1203 20:29:26.394843 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl4bp" event={"ID":"440f7b6a-2225-45ae-8159-478cc4ee8b81","Type":"ContainerStarted","Data":"985272ba444868b52db3772484e1fee07ada214548abdc4fa7f2ba5b604cfee8"}
Dec 03 20:29:26.430725 master-0 kubenswrapper[29252]: I1203 20:29:26.430602 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wl4bp" podStartSLOduration=2.01512176 podStartE2EDuration="3.430571636s" podCreationTimestamp="2025-12-03 20:29:23 +0000 UTC" firstStartedPulling="2025-12-03 20:29:24.364528819 +0000 UTC m=+1199.178073772" lastFinishedPulling="2025-12-03 20:29:25.779978695 +0000 UTC m=+1200.593523648" observedRunningTime="2025-12-03 20:29:26.420053195 +0000 UTC m=+1201.233598158" watchObservedRunningTime="2025-12-03 20:29:26.430571636 +0000 UTC m=+1201.244116599"
Dec 03 20:29:33.650538 master-0 kubenswrapper[29252]: I1203 20:29:33.650393 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wl4bp"
Dec 03 20:29:33.650538 master-0 kubenswrapper[29252]: I1203 20:29:33.650500 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wl4bp"
Dec 03 20:29:33.705578 master-0 kubenswrapper[29252]: I1203 20:29:33.705485 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wl4bp"
Dec 03 20:29:34.521268 master-0 kubenswrapper[29252]: I1203 20:29:34.521185 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wl4bp"
Dec 03 20:29:34.585509 master-0 kubenswrapper[29252]: I1203 20:29:34.585430 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl4bp"]
Dec 03 20:29:36.507415 master-0 kubenswrapper[29252]: I1203 20:29:36.507287 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wl4bp" podUID="440f7b6a-2225-45ae-8159-478cc4ee8b81" containerName="registry-server" containerID="cri-o://985272ba444868b52db3772484e1fee07ada214548abdc4fa7f2ba5b604cfee8" gracePeriod=2
Dec 03 20:29:37.015631 master-0 kubenswrapper[29252]: I1203 20:29:37.015551 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl4bp"
Dec 03 20:29:37.177299 master-0 kubenswrapper[29252]: I1203 20:29:37.177214 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440f7b6a-2225-45ae-8159-478cc4ee8b81-utilities\") pod \"440f7b6a-2225-45ae-8159-478cc4ee8b81\" (UID: \"440f7b6a-2225-45ae-8159-478cc4ee8b81\") "
Dec 03 20:29:37.177528 master-0 kubenswrapper[29252]: I1203 20:29:37.177403 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbbwm\" (UniqueName: \"kubernetes.io/projected/440f7b6a-2225-45ae-8159-478cc4ee8b81-kube-api-access-rbbwm\") pod \"440f7b6a-2225-45ae-8159-478cc4ee8b81\" (UID: \"440f7b6a-2225-45ae-8159-478cc4ee8b81\") "
Dec 03 20:29:37.177528 master-0 kubenswrapper[29252]: I1203 20:29:37.177431 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440f7b6a-2225-45ae-8159-478cc4ee8b81-catalog-content\") pod \"440f7b6a-2225-45ae-8159-478cc4ee8b81\" (UID: \"440f7b6a-2225-45ae-8159-478cc4ee8b81\") "
Dec 03 20:29:37.179329 master-0 kubenswrapper[29252]: I1203 20:29:37.179264 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/440f7b6a-2225-45ae-8159-478cc4ee8b81-utilities" (OuterVolumeSpecName: "utilities") pod "440f7b6a-2225-45ae-8159-478cc4ee8b81" (UID: "440f7b6a-2225-45ae-8159-478cc4ee8b81"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 20:29:37.180607 master-0 kubenswrapper[29252]: I1203 20:29:37.180507 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/440f7b6a-2225-45ae-8159-478cc4ee8b81-kube-api-access-rbbwm" (OuterVolumeSpecName: "kube-api-access-rbbwm") pod "440f7b6a-2225-45ae-8159-478cc4ee8b81" (UID: "440f7b6a-2225-45ae-8159-478cc4ee8b81"). InnerVolumeSpecName "kube-api-access-rbbwm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 20:29:37.200231 master-0 kubenswrapper[29252]: I1203 20:29:37.200147 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/440f7b6a-2225-45ae-8159-478cc4ee8b81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "440f7b6a-2225-45ae-8159-478cc4ee8b81" (UID: "440f7b6a-2225-45ae-8159-478cc4ee8b81"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 20:29:37.279633 master-0 kubenswrapper[29252]: I1203 20:29:37.279561 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbbwm\" (UniqueName: \"kubernetes.io/projected/440f7b6a-2225-45ae-8159-478cc4ee8b81-kube-api-access-rbbwm\") on node \"master-0\" DevicePath \"\""
Dec 03 20:29:37.279962 master-0 kubenswrapper[29252]: I1203 20:29:37.279761 29252 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440f7b6a-2225-45ae-8159-478cc4ee8b81-catalog-content\") on node \"master-0\" DevicePath \"\""
Dec 03 20:29:37.279962 master-0 kubenswrapper[29252]: I1203 20:29:37.279802 29252 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440f7b6a-2225-45ae-8159-478cc4ee8b81-utilities\") on node \"master-0\" DevicePath \"\""
Dec 03 20:29:37.522991 master-0 kubenswrapper[29252]: I1203 20:29:37.522909 29252 generic.go:334] "Generic (PLEG): container finished" podID="440f7b6a-2225-45ae-8159-478cc4ee8b81" containerID="985272ba444868b52db3772484e1fee07ada214548abdc4fa7f2ba5b604cfee8" exitCode=0
Dec 03 20:29:37.522991 master-0 kubenswrapper[29252]: I1203 20:29:37.522984 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl4bp" event={"ID":"440f7b6a-2225-45ae-8159-478cc4ee8b81","Type":"ContainerDied","Data":"985272ba444868b52db3772484e1fee07ada214548abdc4fa7f2ba5b604cfee8"}
Dec 03 20:29:37.523718 master-0 kubenswrapper[29252]: I1203 20:29:37.523027 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl4bp" event={"ID":"440f7b6a-2225-45ae-8159-478cc4ee8b81","Type":"ContainerDied","Data":"442f525287c39d80d7fcecf1437a9476886883e1c8580122e456bda0164d02a7"}
Dec 03 20:29:37.523718 master-0 kubenswrapper[29252]: I1203 20:29:37.523071 29252 scope.go:117] "RemoveContainer" containerID="985272ba444868b52db3772484e1fee07ada214548abdc4fa7f2ba5b604cfee8"
Dec 03 20:29:37.523718 master-0 kubenswrapper[29252]: I1203 20:29:37.523114 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl4bp"
Dec 03 20:29:37.563411 master-0 kubenswrapper[29252]: I1203 20:29:37.563328 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl4bp"]
Dec 03 20:29:37.567914 master-0 kubenswrapper[29252]: I1203 20:29:37.567864 29252 scope.go:117] "RemoveContainer" containerID="de95ad097998e182e2ceca50f5e790a86709b6d7100cf21b1d2cb8ff49823766"
Dec 03 20:29:37.570058 master-0 kubenswrapper[29252]: I1203 20:29:37.569943 29252 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl4bp"]
Dec 03 20:29:37.605762 master-0 kubenswrapper[29252]: I1203 20:29:37.605718 29252 scope.go:117] "RemoveContainer" containerID="6904ee7263e52ca553f3a569a4f813941dd623a3c84cea71a5c8fe08171650a3"
Dec 03 20:29:37.633745 master-0 kubenswrapper[29252]: I1203 20:29:37.633691 29252 scope.go:117] "RemoveContainer" containerID="985272ba444868b52db3772484e1fee07ada214548abdc4fa7f2ba5b604cfee8"
Dec 03 20:29:37.634320 master-0 kubenswrapper[29252]: E1203 20:29:37.634247 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"985272ba444868b52db3772484e1fee07ada214548abdc4fa7f2ba5b604cfee8\": container with ID starting with 985272ba444868b52db3772484e1fee07ada214548abdc4fa7f2ba5b604cfee8 not found: ID does not exist" containerID="985272ba444868b52db3772484e1fee07ada214548abdc4fa7f2ba5b604cfee8"
Dec 03 20:29:37.634424 master-0 kubenswrapper[29252]: I1203 20:29:37.634321 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985272ba444868b52db3772484e1fee07ada214548abdc4fa7f2ba5b604cfee8"} err="failed to get container status \"985272ba444868b52db3772484e1fee07ada214548abdc4fa7f2ba5b604cfee8\": rpc error: code = NotFound desc = could not find container \"985272ba444868b52db3772484e1fee07ada214548abdc4fa7f2ba5b604cfee8\": container with ID starting with 985272ba444868b52db3772484e1fee07ada214548abdc4fa7f2ba5b604cfee8 not found: ID does not exist"
Dec 03 20:29:37.634424 master-0 kubenswrapper[29252]: I1203 20:29:37.634360 29252 scope.go:117] "RemoveContainer" containerID="de95ad097998e182e2ceca50f5e790a86709b6d7100cf21b1d2cb8ff49823766"
Dec 03 20:29:37.634721 master-0 kubenswrapper[29252]: E1203 20:29:37.634680 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de95ad097998e182e2ceca50f5e790a86709b6d7100cf21b1d2cb8ff49823766\": container with ID starting with de95ad097998e182e2ceca50f5e790a86709b6d7100cf21b1d2cb8ff49823766 not found: ID does not exist" containerID="de95ad097998e182e2ceca50f5e790a86709b6d7100cf21b1d2cb8ff49823766"
Dec 03 20:29:37.634803 master-0 kubenswrapper[29252]: I1203 20:29:37.634717 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de95ad097998e182e2ceca50f5e790a86709b6d7100cf21b1d2cb8ff49823766"} err="failed to get container status \"de95ad097998e182e2ceca50f5e790a86709b6d7100cf21b1d2cb8ff49823766\": rpc error: code = NotFound desc = could not find container \"de95ad097998e182e2ceca50f5e790a86709b6d7100cf21b1d2cb8ff49823766\": container with ID starting with de95ad097998e182e2ceca50f5e790a86709b6d7100cf21b1d2cb8ff49823766 not found: ID does not exist"
Dec 03 20:29:37.634803 master-0 kubenswrapper[29252]: I1203 20:29:37.634744 29252 scope.go:117] "RemoveContainer" containerID="6904ee7263e52ca553f3a569a4f813941dd623a3c84cea71a5c8fe08171650a3"
Dec 03 20:29:37.635085 master-0 kubenswrapper[29252]: E1203 20:29:37.635038 29252 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6904ee7263e52ca553f3a569a4f813941dd623a3c84cea71a5c8fe08171650a3\": container with ID starting with 6904ee7263e52ca553f3a569a4f813941dd623a3c84cea71a5c8fe08171650a3 not found: ID does not exist" containerID="6904ee7263e52ca553f3a569a4f813941dd623a3c84cea71a5c8fe08171650a3"
Dec 03 20:29:37.635147 master-0 kubenswrapper[29252]: I1203 20:29:37.635081 29252 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6904ee7263e52ca553f3a569a4f813941dd623a3c84cea71a5c8fe08171650a3"} err="failed to get container status \"6904ee7263e52ca553f3a569a4f813941dd623a3c84cea71a5c8fe08171650a3\": rpc error: code = NotFound desc = could not find container \"6904ee7263e52ca553f3a569a4f813941dd623a3c84cea71a5c8fe08171650a3\": container with ID starting with 6904ee7263e52ca553f3a569a4f813941dd623a3c84cea71a5c8fe08171650a3 not found: ID does not exist"
Dec 03 20:29:38.779828 master-0 kubenswrapper[29252]: I1203 20:29:38.779716 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xfnv8/must-gather-4tdgk"]
Dec 03 20:29:38.780898 master-0 kubenswrapper[29252]: E1203 20:29:38.780153 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440f7b6a-2225-45ae-8159-478cc4ee8b81" containerName="extract-content"
Dec 03 20:29:38.780898 master-0 kubenswrapper[29252]: I1203 20:29:38.780171 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="440f7b6a-2225-45ae-8159-478cc4ee8b81" containerName="extract-content"
Dec 03 20:29:38.780898 master-0 kubenswrapper[29252]: E1203 20:29:38.780184 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440f7b6a-2225-45ae-8159-478cc4ee8b81" containerName="extract-utilities"
Dec 03 20:29:38.780898 master-0 kubenswrapper[29252]: I1203 20:29:38.780191 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="440f7b6a-2225-45ae-8159-478cc4ee8b81" containerName="extract-utilities"
Dec 03 20:29:38.780898 master-0 kubenswrapper[29252]: E1203 20:29:38.780214 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440f7b6a-2225-45ae-8159-478cc4ee8b81" containerName="registry-server"
Dec 03 20:29:38.780898 master-0 kubenswrapper[29252]: I1203 20:29:38.780221 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="440f7b6a-2225-45ae-8159-478cc4ee8b81" containerName="registry-server"
Dec 03 20:29:38.780898 master-0 kubenswrapper[29252]: I1203 20:29:38.780450 29252 memory_manager.go:354] "RemoveStaleState removing state" podUID="440f7b6a-2225-45ae-8159-478cc4ee8b81" containerName="registry-server"
Dec 03 20:29:38.781389 master-0 kubenswrapper[29252]: I1203 20:29:38.781337 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xfnv8/must-gather-4tdgk"
Dec 03 20:29:38.783702 master-0 kubenswrapper[29252]: I1203 20:29:38.783664 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xfnv8"/"openshift-service-ca.crt"
Dec 03 20:29:38.783978 master-0 kubenswrapper[29252]: I1203 20:29:38.783916 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xfnv8"/"kube-root-ca.crt"
Dec 03 20:29:38.794068 master-0 kubenswrapper[29252]: I1203 20:29:38.794021 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xfnv8/must-gather-fgtkq"]
Dec 03 20:29:38.795668 master-0 kubenswrapper[29252]: I1203 20:29:38.795636 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xfnv8/must-gather-fgtkq"
Dec 03 20:29:38.805137 master-0 kubenswrapper[29252]: I1203 20:29:38.805072 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xfnv8/must-gather-4tdgk"]
Dec 03 20:29:38.817984 master-0 kubenswrapper[29252]: I1203 20:29:38.817924 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xfnv8/must-gather-fgtkq"]
Dec 03 20:29:38.913799 master-0 kubenswrapper[29252]: I1203 20:29:38.913715 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cef27c62-3b3e-46eb-a79f-63603ce5730e-must-gather-output\") pod \"must-gather-fgtkq\" (UID: \"cef27c62-3b3e-46eb-a79f-63603ce5730e\") " pod="openshift-must-gather-xfnv8/must-gather-fgtkq"
Dec 03 20:29:38.914028 master-0 kubenswrapper[29252]: I1203 20:29:38.913921 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt5tw\" (UniqueName: \"kubernetes.io/projected/b3c6362b-bd1e-43b4-b8a9-1501dcd35c69-kube-api-access-wt5tw\") pod \"must-gather-4tdgk\" (UID: \"b3c6362b-bd1e-43b4-b8a9-1501dcd35c69\") " pod="openshift-must-gather-xfnv8/must-gather-4tdgk"
Dec 03 20:29:38.914191 master-0 kubenswrapper[29252]: I1203 20:29:38.914148 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b3c6362b-bd1e-43b4-b8a9-1501dcd35c69-must-gather-output\") pod \"must-gather-4tdgk\" (UID: \"b3c6362b-bd1e-43b4-b8a9-1501dcd35c69\") " pod="openshift-must-gather-xfnv8/must-gather-4tdgk"
Dec 03 20:29:38.914322 master-0 kubenswrapper[29252]: I1203 20:29:38.914294 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hvpd\" (UniqueName: \"kubernetes.io/projected/cef27c62-3b3e-46eb-a79f-63603ce5730e-kube-api-access-8hvpd\") pod \"must-gather-fgtkq\" (UID: \"cef27c62-3b3e-46eb-a79f-63603ce5730e\") " pod="openshift-must-gather-xfnv8/must-gather-fgtkq"
Dec 03 20:29:39.015510 master-0 kubenswrapper[29252]: I1203 20:29:39.015450 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cef27c62-3b3e-46eb-a79f-63603ce5730e-must-gather-output\") pod \"must-gather-fgtkq\" (UID: \"cef27c62-3b3e-46eb-a79f-63603ce5730e\") " pod="openshift-must-gather-xfnv8/must-gather-fgtkq"
Dec 03 20:29:39.015825 master-0 kubenswrapper[29252]: I1203 20:29:39.015543 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt5tw\" (UniqueName: \"kubernetes.io/projected/b3c6362b-bd1e-43b4-b8a9-1501dcd35c69-kube-api-access-wt5tw\") pod \"must-gather-4tdgk\" (UID: \"b3c6362b-bd1e-43b4-b8a9-1501dcd35c69\") " pod="openshift-must-gather-xfnv8/must-gather-4tdgk"
Dec 03 20:29:39.015825 master-0 kubenswrapper[29252]: I1203 20:29:39.015619 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b3c6362b-bd1e-43b4-b8a9-1501dcd35c69-must-gather-output\") pod \"must-gather-4tdgk\" (UID: \"b3c6362b-bd1e-43b4-b8a9-1501dcd35c69\") " pod="openshift-must-gather-xfnv8/must-gather-4tdgk"
Dec 03 20:29:39.015825 master-0 kubenswrapper[29252]: I1203 20:29:39.015684 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hvpd\" (UniqueName: \"kubernetes.io/projected/cef27c62-3b3e-46eb-a79f-63603ce5730e-kube-api-access-8hvpd\") pod \"must-gather-fgtkq\" (UID: \"cef27c62-3b3e-46eb-a79f-63603ce5730e\") " pod="openshift-must-gather-xfnv8/must-gather-fgtkq"
Dec 03 20:29:39.019807 master-0 kubenswrapper[29252]: I1203 20:29:39.016651 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cef27c62-3b3e-46eb-a79f-63603ce5730e-must-gather-output\") pod \"must-gather-fgtkq\" (UID: \"cef27c62-3b3e-46eb-a79f-63603ce5730e\") " pod="openshift-must-gather-xfnv8/must-gather-fgtkq"
Dec 03 20:29:39.019807 master-0 kubenswrapper[29252]: I1203 20:29:39.017183 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b3c6362b-bd1e-43b4-b8a9-1501dcd35c69-must-gather-output\") pod \"must-gather-4tdgk\" (UID: \"b3c6362b-bd1e-43b4-b8a9-1501dcd35c69\") " pod="openshift-must-gather-xfnv8/must-gather-4tdgk"
Dec 03 20:29:39.032953 master-0 kubenswrapper[29252]: I1203 20:29:39.032859 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt5tw\" (UniqueName: \"kubernetes.io/projected/b3c6362b-bd1e-43b4-b8a9-1501dcd35c69-kube-api-access-wt5tw\") pod \"must-gather-4tdgk\" (UID: \"b3c6362b-bd1e-43b4-b8a9-1501dcd35c69\") " pod="openshift-must-gather-xfnv8/must-gather-4tdgk"
Dec 03 20:29:39.034916 master-0 kubenswrapper[29252]: I1203 20:29:39.034877 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hvpd\" (UniqueName: \"kubernetes.io/projected/cef27c62-3b3e-46eb-a79f-63603ce5730e-kube-api-access-8hvpd\") pod \"must-gather-fgtkq\" (UID: \"cef27c62-3b3e-46eb-a79f-63603ce5730e\") " pod="openshift-must-gather-xfnv8/must-gather-fgtkq"
Dec 03 20:29:39.115911 master-0 kubenswrapper[29252]: I1203 20:29:39.115834 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xfnv8/must-gather-4tdgk"
Dec 03 20:29:39.130855 master-0 kubenswrapper[29252]: I1203 20:29:39.130808 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xfnv8/must-gather-fgtkq"
Dec 03 20:29:39.493842 master-0 kubenswrapper[29252]: I1203 20:29:39.491327 29252 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="440f7b6a-2225-45ae-8159-478cc4ee8b81" path="/var/lib/kubelet/pods/440f7b6a-2225-45ae-8159-478cc4ee8b81/volumes"
Dec 03 20:29:39.595894 master-0 kubenswrapper[29252]: I1203 20:29:39.595735 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xfnv8/must-gather-fgtkq"]
Dec 03 20:29:39.606826 master-0 kubenswrapper[29252]: W1203 20:29:39.606601 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcef27c62_3b3e_46eb_a79f_63603ce5730e.slice/crio-ac6ccd101438dc1f99b7268e8ead4bb6e0a214f65e9fb20e528c5b8c3a1832d7 WatchSource:0}: Error finding container ac6ccd101438dc1f99b7268e8ead4bb6e0a214f65e9fb20e528c5b8c3a1832d7: Status 404 returned error can't find the container with id ac6ccd101438dc1f99b7268e8ead4bb6e0a214f65e9fb20e528c5b8c3a1832d7
Dec 03 20:29:39.608132 master-0 kubenswrapper[29252]: I1203 20:29:39.608087 29252 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 03 20:29:39.721578 master-0 kubenswrapper[29252]: I1203 20:29:39.721528 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xfnv8/must-gather-4tdgk"]
Dec 03 20:29:40.561563 master-0 kubenswrapper[29252]: I1203 20:29:40.561505 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xfnv8/must-gather-fgtkq" event={"ID":"cef27c62-3b3e-46eb-a79f-63603ce5730e","Type":"ContainerStarted","Data":"ac6ccd101438dc1f99b7268e8ead4bb6e0a214f65e9fb20e528c5b8c3a1832d7"}
Dec 03 20:29:40.563148 master-0 kubenswrapper[29252]: I1203 20:29:40.563105 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xfnv8/must-gather-4tdgk" event={"ID":"b3c6362b-bd1e-43b4-b8a9-1501dcd35c69","Type":"ContainerStarted","Data":"05051df7c26a405174e9edfe0d48543ed0f36bf09c73e99d8a5a27a84a182c1e"}
Dec 03 20:29:41.544740 master-0 kubenswrapper[29252]: I1203 20:29:41.544602 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-7c49fbfc6f-q5wsd_b8709c6c-8729-4702-a3fb-35a072855096/cluster-version-operator/0.log"
Dec 03 20:29:41.586037 master-0 kubenswrapper[29252]: I1203 20:29:41.585510 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xfnv8/must-gather-fgtkq" event={"ID":"cef27c62-3b3e-46eb-a79f-63603ce5730e","Type":"ContainerStarted","Data":"567038348e47c65ade041bde49cfbaa8d74e70ea19b9119fdd303ae59c99416c"}
Dec 03 20:29:41.586037 master-0 kubenswrapper[29252]: I1203 20:29:41.585573 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xfnv8/must-gather-fgtkq" event={"ID":"cef27c62-3b3e-46eb-a79f-63603ce5730e","Type":"ContainerStarted","Data":"59db1e5280313e7e2fb5a27003fcb97c57606251396dd580423d3d92c3da067d"}
Dec 03 20:29:41.607865 master-0 kubenswrapper[29252]: I1203 20:29:41.607762 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xfnv8/must-gather-fgtkq" podStartSLOduration=2.608062994 podStartE2EDuration="3.607743845s" podCreationTimestamp="2025-12-03 20:29:38 +0000 UTC" firstStartedPulling="2025-12-03 20:29:39.607996493 +0000 UTC m=+1214.421541446" lastFinishedPulling="2025-12-03 20:29:40.607677344 +0000 UTC m=+1215.421222297" observedRunningTime="2025-12-03 20:29:41.60111507 +0000 UTC m=+1216.414660023" watchObservedRunningTime="2025-12-03 20:29:41.607743845 +0000 UTC m=+1216.421288798"
Dec 03 20:29:42.573484 master-0 kubenswrapper[29252]: I1203 20:29:42.571948 29252 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-cluster-version_cluster-version-operator-7c49fbfc6f-q5wsd_b8709c6c-8729-4702-a3fb-35a072855096/cluster-version-operator/1.log" Dec 03 20:29:46.600796 master-0 kubenswrapper[29252]: I1203 20:29:46.600091 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-96nlf_1ae49184-91db-4355-b553-8cc5506e80bc/controller/0.log" Dec 03 20:29:46.615795 master-0 kubenswrapper[29252]: I1203 20:29:46.614341 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-96nlf_1ae49184-91db-4355-b553-8cc5506e80bc/kube-rbac-proxy/0.log" Dec 03 20:29:46.657799 master-0 kubenswrapper[29252]: I1203 20:29:46.656341 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-c7bbm_4977c492-5b52-447d-ab42-4a70601a0da4/controller/0.log" Dec 03 20:29:46.800675 master-0 kubenswrapper[29252]: I1203 20:29:46.799728 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-c7bbm_4977c492-5b52-447d-ab42-4a70601a0da4/frr/0.log" Dec 03 20:29:46.821402 master-0 kubenswrapper[29252]: I1203 20:29:46.821336 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-c7bbm_4977c492-5b52-447d-ab42-4a70601a0da4/reloader/0.log" Dec 03 20:29:46.831955 master-0 kubenswrapper[29252]: I1203 20:29:46.831909 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-c7bbm_4977c492-5b52-447d-ab42-4a70601a0da4/frr-metrics/0.log" Dec 03 20:29:46.845741 master-0 kubenswrapper[29252]: I1203 20:29:46.844300 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-c7bbm_4977c492-5b52-447d-ab42-4a70601a0da4/kube-rbac-proxy/0.log" Dec 03 20:29:46.857900 master-0 kubenswrapper[29252]: I1203 20:29:46.857400 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-c7bbm_4977c492-5b52-447d-ab42-4a70601a0da4/kube-rbac-proxy-frr/0.log" Dec 
03 20:29:46.871631 master-0 kubenswrapper[29252]: I1203 20:29:46.871577 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-c7bbm_4977c492-5b52-447d-ab42-4a70601a0da4/cp-frr-files/0.log" Dec 03 20:29:46.881586 master-0 kubenswrapper[29252]: I1203 20:29:46.881515 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-c7bbm_4977c492-5b52-447d-ab42-4a70601a0da4/cp-reloader/0.log" Dec 03 20:29:46.893654 master-0 kubenswrapper[29252]: I1203 20:29:46.893561 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-c7bbm_4977c492-5b52-447d-ab42-4a70601a0da4/cp-metrics/0.log" Dec 03 20:29:46.907651 master-0 kubenswrapper[29252]: I1203 20:29:46.907589 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-58pgc_2bebbe36-46ba-47e9-b53e-2c83abe9c329/frr-k8s-webhook-server/0.log" Dec 03 20:29:46.927560 master-0 kubenswrapper[29252]: I1203 20:29:46.926453 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7d9499bd6f-4h4ww_cb6f03f1-adfa-4249-b822-7dd4acf245be/manager/0.log" Dec 03 20:29:46.945735 master-0 kubenswrapper[29252]: I1203 20:29:46.945694 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-bfbbd6984-sx9nv_5a2dc21d-ada3-4739-9e62-cbdba8e4985a/webhook-server/0.log" Dec 03 20:29:46.983392 master-0 kubenswrapper[29252]: I1203 20:29:46.983345 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-9nn22_7010359f-6e7a-41c9-9a49-f29d67babf3c/nmstate-console-plugin/0.log" Dec 03 20:29:47.017838 master-0 kubenswrapper[29252]: I1203 20:29:47.017798 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-zfhdt_0944c190-d25f-481e-b59a-75869f8dc9e2/nmstate-handler/0.log" Dec 03 20:29:47.061470 master-0 
kubenswrapper[29252]: I1203 20:29:47.061435 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-xmxkb_6980a7eb-a3c8-4496-87aa-b56680009c84/nmstate-metrics/0.log" Dec 03 20:29:47.069272 master-0 kubenswrapper[29252]: I1203 20:29:47.069219 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-j26tj_52511992-a397-485f-b709-f81257ee8e16/speaker/0.log" Dec 03 20:29:47.075106 master-0 kubenswrapper[29252]: I1203 20:29:47.075072 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-xmxkb_6980a7eb-a3c8-4496-87aa-b56680009c84/kube-rbac-proxy/0.log" Dec 03 20:29:47.079222 master-0 kubenswrapper[29252]: I1203 20:29:47.079199 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-j26tj_52511992-a397-485f-b709-f81257ee8e16/kube-rbac-proxy/0.log" Dec 03 20:29:47.089923 master-0 kubenswrapper[29252]: I1203 20:29:47.089849 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-l9xts_bc12a15e-d84d-430c-a33c-833407ab976d/nmstate-operator/0.log" Dec 03 20:29:47.110658 master-0 kubenswrapper[29252]: I1203 20:29:47.110613 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-9cg4d_cabd9912-85fb-4fca-a116-1c9bf1ab19e1/nmstate-webhook/0.log" Dec 03 20:29:47.957576 master-0 kubenswrapper[29252]: I1203 20:29:47.957535 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcdctl/0.log" Dec 03 20:29:48.239962 master-0 kubenswrapper[29252]: I1203 20:29:48.239721 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd/0.log" Dec 03 20:29:48.257288 master-0 kubenswrapper[29252]: I1203 20:29:48.257234 29252 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-metrics/0.log" Dec 03 20:29:48.271113 master-0 kubenswrapper[29252]: I1203 20:29:48.271070 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-readyz/0.log" Dec 03 20:29:48.289375 master-0 kubenswrapper[29252]: I1203 20:29:48.289283 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-rev/0.log" Dec 03 20:29:48.298488 master-0 kubenswrapper[29252]: I1203 20:29:48.298445 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/setup/0.log" Dec 03 20:29:48.318701 master-0 kubenswrapper[29252]: I1203 20:29:48.318672 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-ensure-env-vars/0.log" Dec 03 20:29:48.332759 master-0 kubenswrapper[29252]: I1203 20:29:48.332709 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-resources-copy/0.log" Dec 03 20:29:48.372938 master-0 kubenswrapper[29252]: I1203 20:29:48.371833 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_bacd155a-fee3-4e5e-89a2-ab86f401d2ff/installer/0.log" Dec 03 20:29:49.651824 master-0 kubenswrapper[29252]: I1203 20:29:49.651632 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/assisted-installer_assisted-installer-controller-ljsns_0b6e1832-278b-4e37-b92b-2584e2daa34c/assisted-installer-controller/0.log" Dec 03 20:29:50.054932 master-0 kubenswrapper[29252]: I1203 20:29:50.054898 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-65494c7-95b4g_d45f85ea-34d0-4a87-9c62-39101a4756af/oauth-openshift/0.log" Dec 03 20:29:51.678503 
master-0 kubenswrapper[29252]: I1203 20:29:51.678448 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/5.log" Dec 03 20:29:51.732147 master-0 kubenswrapper[29252]: I1203 20:29:51.732088 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-mfwhz_a185ee17-4b4b-4d20-a8ed-56a2a01f1807/authentication-operator/6.log" Dec 03 20:29:52.330982 master-0 kubenswrapper[29252]: I1203 20:29:52.330932 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-54f97f57-l5rb2_a4f90ab0-b480-44a2-b87d-220ab6bba9c5/router/0.log" Dec 03 20:29:52.789106 master-0 kubenswrapper[29252]: I1203 20:29:52.789047 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xfnv8/must-gather-4tdgk" event={"ID":"b3c6362b-bd1e-43b4-b8a9-1501dcd35c69","Type":"ContainerStarted","Data":"8cd106d0923d9e9f9bb225d5d11c1cd41013199ed62f54d6ba411556ea7f913b"} Dec 03 20:29:52.789106 master-0 kubenswrapper[29252]: I1203 20:29:52.789101 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xfnv8/must-gather-4tdgk" event={"ID":"b3c6362b-bd1e-43b4-b8a9-1501dcd35c69","Type":"ContainerStarted","Data":"b72cba2d62fc9d59f1fc011d5e2ee2103acd0e38f7d3a91767012e1eb6cf9e15"} Dec 03 20:29:52.812472 master-0 kubenswrapper[29252]: I1203 20:29:52.812385 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xfnv8/must-gather-4tdgk" podStartSLOduration=3.162186801 podStartE2EDuration="14.812368738s" podCreationTimestamp="2025-12-03 20:29:38 +0000 UTC" firstStartedPulling="2025-12-03 20:29:39.725922699 +0000 UTC m=+1214.539467652" lastFinishedPulling="2025-12-03 20:29:51.376104636 +0000 UTC m=+1226.189649589" observedRunningTime="2025-12-03 20:29:52.806497302 +0000 
UTC m=+1227.620042275" watchObservedRunningTime="2025-12-03 20:29:52.812368738 +0000 UTC m=+1227.625913701" Dec 03 20:29:53.037135 master-0 kubenswrapper[29252]: I1203 20:29:53.037087 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-597ff7d589-qjxsb_f96c70ce-314a-4919-91e9-cc776a620846/oauth-apiserver/0.log" Dec 03 20:29:53.052865 master-0 kubenswrapper[29252]: I1203 20:29:53.052820 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-597ff7d589-qjxsb_f96c70ce-314a-4919-91e9-cc776a620846/fix-audit-permissions/0.log" Dec 03 20:29:53.201341 master-0 kubenswrapper[29252]: I1203 20:29:53.201293 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w"] Dec 03 20:29:53.202444 master-0 kubenswrapper[29252]: I1203 20:29:53.202429 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:29:53.233166 master-0 kubenswrapper[29252]: I1203 20:29:53.233069 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w"] Dec 03 20:29:53.253421 master-0 kubenswrapper[29252]: I1203 20:29:53.253324 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cc09feae-23df-4f5f-853e-ec5201baf522-proc\") pod \"perf-node-gather-daemonset-b8c2w\" (UID: \"cc09feae-23df-4f5f-853e-ec5201baf522\") " pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:29:53.253421 master-0 kubenswrapper[29252]: I1203 20:29:53.253408 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc09feae-23df-4f5f-853e-ec5201baf522-sys\") pod \"perf-node-gather-daemonset-b8c2w\" (UID: 
\"cc09feae-23df-4f5f-853e-ec5201baf522\") " pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:29:53.253769 master-0 kubenswrapper[29252]: I1203 20:29:53.253527 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc09feae-23df-4f5f-853e-ec5201baf522-lib-modules\") pod \"perf-node-gather-daemonset-b8c2w\" (UID: \"cc09feae-23df-4f5f-853e-ec5201baf522\") " pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:29:53.253769 master-0 kubenswrapper[29252]: I1203 20:29:53.253573 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dtwn\" (UniqueName: \"kubernetes.io/projected/cc09feae-23df-4f5f-853e-ec5201baf522-kube-api-access-4dtwn\") pod \"perf-node-gather-daemonset-b8c2w\" (UID: \"cc09feae-23df-4f5f-853e-ec5201baf522\") " pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:29:53.253769 master-0 kubenswrapper[29252]: I1203 20:29:53.253747 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cc09feae-23df-4f5f-853e-ec5201baf522-podres\") pod \"perf-node-gather-daemonset-b8c2w\" (UID: \"cc09feae-23df-4f5f-853e-ec5201baf522\") " pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:29:53.354996 master-0 kubenswrapper[29252]: I1203 20:29:53.354878 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cc09feae-23df-4f5f-853e-ec5201baf522-podres\") pod \"perf-node-gather-daemonset-b8c2w\" (UID: \"cc09feae-23df-4f5f-853e-ec5201baf522\") " pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:29:53.354996 master-0 kubenswrapper[29252]: I1203 20:29:53.354961 29252 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cc09feae-23df-4f5f-853e-ec5201baf522-proc\") pod \"perf-node-gather-daemonset-b8c2w\" (UID: \"cc09feae-23df-4f5f-853e-ec5201baf522\") " pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:29:53.354996 master-0 kubenswrapper[29252]: I1203 20:29:53.354984 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc09feae-23df-4f5f-853e-ec5201baf522-sys\") pod \"perf-node-gather-daemonset-b8c2w\" (UID: \"cc09feae-23df-4f5f-853e-ec5201baf522\") " pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:29:53.355274 master-0 kubenswrapper[29252]: I1203 20:29:53.355029 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc09feae-23df-4f5f-853e-ec5201baf522-lib-modules\") pod \"perf-node-gather-daemonset-b8c2w\" (UID: \"cc09feae-23df-4f5f-853e-ec5201baf522\") " pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:29:53.355274 master-0 kubenswrapper[29252]: I1203 20:29:53.355059 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dtwn\" (UniqueName: \"kubernetes.io/projected/cc09feae-23df-4f5f-853e-ec5201baf522-kube-api-access-4dtwn\") pod \"perf-node-gather-daemonset-b8c2w\" (UID: \"cc09feae-23df-4f5f-853e-ec5201baf522\") " pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:29:53.355274 master-0 kubenswrapper[29252]: I1203 20:29:53.355083 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cc09feae-23df-4f5f-853e-ec5201baf522-podres\") pod \"perf-node-gather-daemonset-b8c2w\" (UID: \"cc09feae-23df-4f5f-853e-ec5201baf522\") " pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:29:53.355274 master-0 
kubenswrapper[29252]: I1203 20:29:53.355162 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc09feae-23df-4f5f-853e-ec5201baf522-sys\") pod \"perf-node-gather-daemonset-b8c2w\" (UID: \"cc09feae-23df-4f5f-853e-ec5201baf522\") " pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:29:53.355274 master-0 kubenswrapper[29252]: I1203 20:29:53.355199 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cc09feae-23df-4f5f-853e-ec5201baf522-proc\") pod \"perf-node-gather-daemonset-b8c2w\" (UID: \"cc09feae-23df-4f5f-853e-ec5201baf522\") " pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:29:53.355274 master-0 kubenswrapper[29252]: I1203 20:29:53.355247 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc09feae-23df-4f5f-853e-ec5201baf522-lib-modules\") pod \"perf-node-gather-daemonset-b8c2w\" (UID: \"cc09feae-23df-4f5f-853e-ec5201baf522\") " pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:29:53.376602 master-0 kubenswrapper[29252]: I1203 20:29:53.376539 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dtwn\" (UniqueName: \"kubernetes.io/projected/cc09feae-23df-4f5f-853e-ec5201baf522-kube-api-access-4dtwn\") pod \"perf-node-gather-daemonset-b8c2w\" (UID: \"cc09feae-23df-4f5f-853e-ec5201baf522\") " pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:29:53.522977 master-0 kubenswrapper[29252]: I1203 20:29:53.522918 29252 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:29:53.813218 master-0 kubenswrapper[29252]: I1203 20:29:53.813105 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-7f88444875-kqfs4_b2021db5-b27a-4e06-beec-d9ba82aa1ffc/kube-rbac-proxy/0.log" Dec 03 20:29:53.863976 master-0 kubenswrapper[29252]: I1203 20:29:53.861239 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-7f88444875-kqfs4_b2021db5-b27a-4e06-beec-d9ba82aa1ffc/cluster-autoscaler-operator/0.log" Dec 03 20:29:53.877522 master-0 kubenswrapper[29252]: I1203 20:29:53.877386 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q47xb_433c3273-c99e-4d68-befc-06f92d2fc8d5/cluster-baremetal-operator/2.log" Dec 03 20:29:53.881040 master-0 kubenswrapper[29252]: I1203 20:29:53.880915 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q47xb_433c3273-c99e-4d68-befc-06f92d2fc8d5/cluster-baremetal-operator/3.log" Dec 03 20:29:53.892521 master-0 kubenswrapper[29252]: I1203 20:29:53.891941 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q47xb_433c3273-c99e-4d68-befc-06f92d2fc8d5/baremetal-kube-rbac-proxy/0.log" Dec 03 20:29:53.909014 master-0 kubenswrapper[29252]: I1203 20:29:53.908970 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-66f4cc99d4-2llfg_cd35fc5f-07ab-4c66-9b80-33a598d417ef/control-plane-machine-set-operator/0.log" Dec 03 20:29:53.935832 master-0 kubenswrapper[29252]: I1203 20:29:53.931265 29252 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-7486ff55f-9p9rq_ad22d8ed-2476-441b-aa3b-a7845606b0ac/kube-rbac-proxy/0.log" Dec 03 20:29:53.956871 master-0 kubenswrapper[29252]: I1203 20:29:53.955737 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-7486ff55f-9p9rq_ad22d8ed-2476-441b-aa3b-a7845606b0ac/machine-api-operator/0.log" Dec 03 20:29:53.972967 master-0 kubenswrapper[29252]: W1203 20:29:53.972906 29252 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcc09feae_23df_4f5f_853e_ec5201baf522.slice/crio-4beb9dd322f51defedb95f659009ac0823fe584b6c9d8d6ba3863b958707c105 WatchSource:0}: Error finding container 4beb9dd322f51defedb95f659009ac0823fe584b6c9d8d6ba3863b958707c105: Status 404 returned error can't find the container with id 4beb9dd322f51defedb95f659009ac0823fe584b6c9d8d6ba3863b958707c105 Dec 03 20:29:53.973125 master-0 kubenswrapper[29252]: I1203 20:29:53.973072 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w"] Dec 03 20:29:54.826464 master-0 kubenswrapper[29252]: I1203 20:29:54.826403 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" event={"ID":"cc09feae-23df-4f5f-853e-ec5201baf522","Type":"ContainerStarted","Data":"6c06d2497128f6f32c0979b92c39bd5242c99f1db35409dde99d044ef60a1155"} Dec 03 20:29:54.826464 master-0 kubenswrapper[29252]: I1203 20:29:54.826456 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" event={"ID":"cc09feae-23df-4f5f-853e-ec5201baf522","Type":"ContainerStarted","Data":"4beb9dd322f51defedb95f659009ac0823fe584b6c9d8d6ba3863b958707c105"} Dec 03 20:29:54.827530 master-0 kubenswrapper[29252]: I1203 20:29:54.827505 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:29:54.849386 master-0 kubenswrapper[29252]: I1203 20:29:54.849306 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" podStartSLOduration=1.849285891 podStartE2EDuration="1.849285891s" podCreationTimestamp="2025-12-03 20:29:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 20:29:54.844243586 +0000 UTC m=+1229.657788559" watchObservedRunningTime="2025-12-03 20:29:54.849285891 +0000 UTC m=+1229.662830844" Dec 03 20:29:55.218397 master-0 kubenswrapper[29252]: I1203 20:29:55.218341 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75_90610a53-b590-491e-8014-f0704afdc6e1/cluster-cloud-controller-manager/0.log" Dec 03 20:29:55.246412 master-0 kubenswrapper[29252]: I1203 20:29:55.246361 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75_90610a53-b590-491e-8014-f0704afdc6e1/config-sync-controllers/0.log" Dec 03 20:29:55.256426 master-0 kubenswrapper[29252]: I1203 20:29:55.256385 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6c74dddbfb-f2h75_90610a53-b590-491e-8014-f0704afdc6e1/kube-rbac-proxy/0.log" Dec 03 20:29:56.856305 master-0 kubenswrapper[29252]: I1203 20:29:56.856251 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-7c4dc67499-lqdlr_6404bbc7-8ca9-4f20-8ce7-40f855555160/kube-rbac-proxy/0.log" Dec 03 20:29:56.888949 master-0 kubenswrapper[29252]: I1203 20:29:56.888895 29252 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-7c4dc67499-lqdlr_6404bbc7-8ca9-4f20-8ce7-40f855555160/cloud-credential-operator/0.log" Dec 03 20:29:58.555612 master-0 kubenswrapper[29252]: I1203 20:29:58.555570 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-8xmrv_0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/openshift-config-operator/4.log" Dec 03 20:29:58.557803 master-0 kubenswrapper[29252]: I1203 20:29:58.557768 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-8xmrv_0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/openshift-config-operator/5.log" Dec 03 20:29:58.581314 master-0 kubenswrapper[29252]: I1203 20:29:58.581274 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-8xmrv_0ac1ae27-c34b-4bab-9f60-b2e2f9ad18b9/openshift-api/0.log" Dec 03 20:29:59.407459 master-0 kubenswrapper[29252]: I1203 20:29:59.407411 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77df56447c-vnp7f_655df528-d475-4908-ba38-a5a646744484/console-operator/0.log" Dec 03 20:29:59.973172 master-0 kubenswrapper[29252]: I1203 20:29:59.973115 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7ffc94f8bc-2s94b_f570066d-b49f-40fa-b901-9a89f265d1b1/console/0.log" Dec 03 20:29:59.994762 master-0 kubenswrapper[29252]: I1203 20:29:59.994712 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6f5db8559b-hd8bd_68a9a9c5-fd3f-4a9b-ba9f-6a02dbb70f64/download-server/0.log" Dec 03 20:30:00.175621 master-0 kubenswrapper[29252]: I1203 20:30:00.175563 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx"] Dec 03 20:30:00.177298 master-0 
kubenswrapper[29252]: I1203 20:30:00.177271 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx" Dec 03 20:30:00.180057 master-0 kubenswrapper[29252]: I1203 20:30:00.180024 29252 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-nqzgz" Dec 03 20:30:00.180294 master-0 kubenswrapper[29252]: I1203 20:30:00.180185 29252 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 20:30:00.194878 master-0 kubenswrapper[29252]: I1203 20:30:00.194618 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx"] Dec 03 20:30:00.292921 master-0 kubenswrapper[29252]: I1203 20:30:00.291204 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f7b1203-3453-45f5-b1cb-7cadfa86e314-config-volume\") pod \"collect-profiles-29413230-czjrx\" (UID: \"0f7b1203-3453-45f5-b1cb-7cadfa86e314\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx" Dec 03 20:30:00.292921 master-0 kubenswrapper[29252]: I1203 20:30:00.291285 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f7b1203-3453-45f5-b1cb-7cadfa86e314-secret-volume\") pod \"collect-profiles-29413230-czjrx\" (UID: \"0f7b1203-3453-45f5-b1cb-7cadfa86e314\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx" Dec 03 20:30:00.292921 master-0 kubenswrapper[29252]: I1203 20:30:00.291571 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl2sg\" (UniqueName: 
\"kubernetes.io/projected/0f7b1203-3453-45f5-b1cb-7cadfa86e314-kube-api-access-tl2sg\") pod \"collect-profiles-29413230-czjrx\" (UID: \"0f7b1203-3453-45f5-b1cb-7cadfa86e314\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx" Dec 03 20:30:00.393243 master-0 kubenswrapper[29252]: I1203 20:30:00.393155 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f7b1203-3453-45f5-b1cb-7cadfa86e314-config-volume\") pod \"collect-profiles-29413230-czjrx\" (UID: \"0f7b1203-3453-45f5-b1cb-7cadfa86e314\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx" Dec 03 20:30:00.393243 master-0 kubenswrapper[29252]: I1203 20:30:00.393243 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f7b1203-3453-45f5-b1cb-7cadfa86e314-secret-volume\") pod \"collect-profiles-29413230-czjrx\" (UID: \"0f7b1203-3453-45f5-b1cb-7cadfa86e314\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx" Dec 03 20:30:00.393532 master-0 kubenswrapper[29252]: I1203 20:30:00.393361 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl2sg\" (UniqueName: \"kubernetes.io/projected/0f7b1203-3453-45f5-b1cb-7cadfa86e314-kube-api-access-tl2sg\") pod \"collect-profiles-29413230-czjrx\" (UID: \"0f7b1203-3453-45f5-b1cb-7cadfa86e314\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx" Dec 03 20:30:00.394704 master-0 kubenswrapper[29252]: I1203 20:30:00.394663 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f7b1203-3453-45f5-b1cb-7cadfa86e314-config-volume\") pod \"collect-profiles-29413230-czjrx\" (UID: \"0f7b1203-3453-45f5-b1cb-7cadfa86e314\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx" Dec 
03 20:30:00.397892 master-0 kubenswrapper[29252]: I1203 20:30:00.397855 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f7b1203-3453-45f5-b1cb-7cadfa86e314-secret-volume\") pod \"collect-profiles-29413230-czjrx\" (UID: \"0f7b1203-3453-45f5-b1cb-7cadfa86e314\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx" Dec 03 20:30:00.409716 master-0 kubenswrapper[29252]: I1203 20:30:00.409659 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl2sg\" (UniqueName: \"kubernetes.io/projected/0f7b1203-3453-45f5-b1cb-7cadfa86e314-kube-api-access-tl2sg\") pod \"collect-profiles-29413230-czjrx\" (UID: \"0f7b1203-3453-45f5-b1cb-7cadfa86e314\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx" Dec 03 20:30:00.494981 master-0 kubenswrapper[29252]: I1203 20:30:00.494912 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx" Dec 03 20:30:00.807738 master-0 kubenswrapper[29252]: I1203 20:30:00.807527 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/3.log" Dec 03 20:30:00.819167 master-0 kubenswrapper[29252]: I1203 20:30:00.819035 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-wnl8p_f749c7f2-1fd7-4078-a92d-0ae5523998ac/cluster-storage-operator/4.log" Dec 03 20:30:00.834955 master-0 kubenswrapper[29252]: I1203 20:30:00.834904 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/3.log" Dec 03 20:30:00.836453 master-0 kubenswrapper[29252]: 
I1203 20:30:00.836398 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-s29k7_367c2c7c-1fc8-4608-aa94-b64c6c70cc61/snapshot-controller/4.log" Dec 03 20:30:00.850930 master-0 kubenswrapper[29252]: I1203 20:30:00.850876 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-7b795784b8-4gppw_b84835e3-e8bc-4aa4-a8f3-f9be702a358a/csi-snapshot-controller-operator/2.log" Dec 03 20:30:00.856074 master-0 kubenswrapper[29252]: I1203 20:30:00.856027 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-7b795784b8-4gppw_b84835e3-e8bc-4aa4-a8f3-f9be702a358a/csi-snapshot-controller-operator/3.log" Dec 03 20:30:00.930692 master-0 kubenswrapper[29252]: I1203 20:30:00.930618 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx"] Dec 03 20:30:01.506985 master-0 kubenswrapper[29252]: I1203 20:30:01.506879 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-6b7bcd6566-4wcq2_128ed384-7ab6-41b6-bf45-c8fda917d52f/dns-operator/0.log" Dec 03 20:30:01.518849 master-0 kubenswrapper[29252]: I1203 20:30:01.518802 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-6b7bcd6566-4wcq2_128ed384-7ab6-41b6-bf45-c8fda917d52f/kube-rbac-proxy/0.log" Dec 03 20:30:01.887469 master-0 kubenswrapper[29252]: I1203 20:30:01.887391 29252 generic.go:334] "Generic (PLEG): container finished" podID="0f7b1203-3453-45f5-b1cb-7cadfa86e314" containerID="2c79ff906247c705a09cf6357ac93d885efebfe78c47a0df546f2397f51c21e4" exitCode=0 Dec 03 20:30:01.887469 master-0 kubenswrapper[29252]: I1203 20:30:01.887447 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx" event={"ID":"0f7b1203-3453-45f5-b1cb-7cadfa86e314","Type":"ContainerDied","Data":"2c79ff906247c705a09cf6357ac93d885efebfe78c47a0df546f2397f51c21e4"} Dec 03 20:30:01.887742 master-0 kubenswrapper[29252]: I1203 20:30:01.887504 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx" event={"ID":"0f7b1203-3453-45f5-b1cb-7cadfa86e314","Type":"ContainerStarted","Data":"0619f56b4dd9b62aa5624667b6946f6d4d4e7944aa53426f8ab52686a8142721"} Dec 03 20:30:01.970848 master-0 kubenswrapper[29252]: I1203 20:30:01.970764 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dbfhg_d196dca7-f940-4aa0-b20a-214d22b62db6/dns/0.log" Dec 03 20:30:01.980797 master-0 kubenswrapper[29252]: I1203 20:30:01.980732 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dbfhg_d196dca7-f940-4aa0-b20a-214d22b62db6/kube-rbac-proxy/0.log" Dec 03 20:30:01.997276 master-0 kubenswrapper[29252]: I1203 20:30:01.997232 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hk22l_56e013ee-ea7a-4780-8986-a7fd1b5a3a3f/dns-node-resolver/0.log" Dec 03 20:30:02.583738 master-0 kubenswrapper[29252]: I1203 20:30:02.583667 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-7978bf889c-mqpzf_78a864f2-934f-4197-9753-24c9bc7f1fca/etcd-operator/3.log" Dec 03 20:30:02.601567 master-0 kubenswrapper[29252]: I1203 20:30:02.601520 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-7978bf889c-mqpzf_78a864f2-934f-4197-9753-24c9bc7f1fca/etcd-operator/4.log" Dec 03 20:30:03.265022 master-0 kubenswrapper[29252]: I1203 20:30:03.264963 29252 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx" Dec 03 20:30:03.316896 master-0 kubenswrapper[29252]: I1203 20:30:03.316823 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcdctl/0.log" Dec 03 20:30:03.448572 master-0 kubenswrapper[29252]: I1203 20:30:03.448207 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl2sg\" (UniqueName: \"kubernetes.io/projected/0f7b1203-3453-45f5-b1cb-7cadfa86e314-kube-api-access-tl2sg\") pod \"0f7b1203-3453-45f5-b1cb-7cadfa86e314\" (UID: \"0f7b1203-3453-45f5-b1cb-7cadfa86e314\") " Dec 03 20:30:03.448572 master-0 kubenswrapper[29252]: I1203 20:30:03.448352 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f7b1203-3453-45f5-b1cb-7cadfa86e314-config-volume\") pod \"0f7b1203-3453-45f5-b1cb-7cadfa86e314\" (UID: \"0f7b1203-3453-45f5-b1cb-7cadfa86e314\") " Dec 03 20:30:03.448572 master-0 kubenswrapper[29252]: I1203 20:30:03.448435 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f7b1203-3453-45f5-b1cb-7cadfa86e314-secret-volume\") pod \"0f7b1203-3453-45f5-b1cb-7cadfa86e314\" (UID: \"0f7b1203-3453-45f5-b1cb-7cadfa86e314\") " Dec 03 20:30:03.449219 master-0 kubenswrapper[29252]: I1203 20:30:03.449000 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f7b1203-3453-45f5-b1cb-7cadfa86e314-config-volume" (OuterVolumeSpecName: "config-volume") pod "0f7b1203-3453-45f5-b1cb-7cadfa86e314" (UID: "0f7b1203-3453-45f5-b1cb-7cadfa86e314"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 20:30:03.466994 master-0 kubenswrapper[29252]: I1203 20:30:03.465501 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f7b1203-3453-45f5-b1cb-7cadfa86e314-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0f7b1203-3453-45f5-b1cb-7cadfa86e314" (UID: "0f7b1203-3453-45f5-b1cb-7cadfa86e314"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 20:30:03.466994 master-0 kubenswrapper[29252]: I1203 20:30:03.465505 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f7b1203-3453-45f5-b1cb-7cadfa86e314-kube-api-access-tl2sg" (OuterVolumeSpecName: "kube-api-access-tl2sg") pod "0f7b1203-3453-45f5-b1cb-7cadfa86e314" (UID: "0f7b1203-3453-45f5-b1cb-7cadfa86e314"). InnerVolumeSpecName "kube-api-access-tl2sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 20:30:03.550545 master-0 kubenswrapper[29252]: I1203 20:30:03.550467 29252 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f7b1203-3453-45f5-b1cb-7cadfa86e314-secret-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 20:30:03.550545 master-0 kubenswrapper[29252]: I1203 20:30:03.550518 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl2sg\" (UniqueName: \"kubernetes.io/projected/0f7b1203-3453-45f5-b1cb-7cadfa86e314-kube-api-access-tl2sg\") on node \"master-0\" DevicePath \"\"" Dec 03 20:30:03.550545 master-0 kubenswrapper[29252]: I1203 20:30:03.550533 29252 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f7b1203-3453-45f5-b1cb-7cadfa86e314-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 20:30:03.559670 master-0 kubenswrapper[29252]: I1203 20:30:03.559616 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-must-gather-xfnv8/perf-node-gather-daemonset-b8c2w" Dec 03 20:30:03.618329 master-0 kubenswrapper[29252]: I1203 20:30:03.618262 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd/0.log" Dec 03 20:30:03.633747 master-0 kubenswrapper[29252]: I1203 20:30:03.633686 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-metrics/0.log" Dec 03 20:30:03.642036 master-0 kubenswrapper[29252]: I1203 20:30:03.642000 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-readyz/0.log" Dec 03 20:30:03.654967 master-0 kubenswrapper[29252]: I1203 20:30:03.654918 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-rev/0.log" Dec 03 20:30:03.669457 master-0 kubenswrapper[29252]: I1203 20:30:03.669405 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/setup/0.log" Dec 03 20:30:03.697584 master-0 kubenswrapper[29252]: I1203 20:30:03.697531 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-ensure-env-vars/0.log" Dec 03 20:30:03.716015 master-0 kubenswrapper[29252]: I1203 20:30:03.715885 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-resources-copy/0.log" Dec 03 20:30:03.758512 master-0 kubenswrapper[29252]: I1203 20:30:03.758456 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_bacd155a-fee3-4e5e-89a2-ab86f401d2ff/installer/0.log" Dec 03 20:30:03.906694 master-0 kubenswrapper[29252]: I1203 20:30:03.906638 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx" event={"ID":"0f7b1203-3453-45f5-b1cb-7cadfa86e314","Type":"ContainerDied","Data":"0619f56b4dd9b62aa5624667b6946f6d4d4e7944aa53426f8ab52686a8142721"} Dec 03 20:30:03.906694 master-0 kubenswrapper[29252]: I1203 20:30:03.906690 29252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0619f56b4dd9b62aa5624667b6946f6d4d4e7944aa53426f8ab52686a8142721" Dec 03 20:30:03.907060 master-0 kubenswrapper[29252]: I1203 20:30:03.907031 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413230-czjrx" Dec 03 20:30:04.450243 master-0 kubenswrapper[29252]: I1203 20:30:04.450190 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_cluster-image-registry-operator-65dc4bcb88-59j4p_5decce88-c71e-411c-87b5-a37dd0f77e7b/cluster-image-registry-operator/0.log" Dec 03 20:30:04.450731 master-0 kubenswrapper[29252]: I1203 20:30:04.450705 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_cluster-image-registry-operator-65dc4bcb88-59j4p_5decce88-c71e-411c-87b5-a37dd0f77e7b/cluster-image-registry-operator/1.log" Dec 03 20:30:04.466027 master-0 kubenswrapper[29252]: I1203 20:30:04.465971 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cdvg6_82b6f6a1-aac8-4293-bdf9-8e85ca6d5898/node-ca/0.log" Dec 03 20:30:05.077888 master-0 kubenswrapper[29252]: I1203 20:30:05.077828 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/3.log" Dec 03 20:30:05.081949 master-0 kubenswrapper[29252]: I1203 20:30:05.081909 29252 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/ingress-operator/4.log" Dec 03 20:30:05.098443 master-0 kubenswrapper[29252]: I1203 20:30:05.098400 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-l7bzj_3f69a3c7-cb00-4f28-b1e7-52bcdb53fbbf/kube-rbac-proxy/0.log" Dec 03 20:30:05.809206 master-0 kubenswrapper[29252]: I1203 20:30:05.809146 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rj2f6_8602809a-528e-4f1a-9157-c45f0da4b768/serve-healthcheck-canary/0.log" Dec 03 20:30:06.359610 master-0 kubenswrapper[29252]: I1203 20:30:06.359557 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-59d99f9b7b-h64kt_af2023e1-9c7a-40af-a6bf-fba31c3565b1/insights-operator/4.log" Dec 03 20:30:06.363831 master-0 kubenswrapper[29252]: I1203 20:30:06.363793 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-59d99f9b7b-h64kt_af2023e1-9c7a-40af-a6bf-fba31c3565b1/insights-operator/5.log" Dec 03 20:30:08.146529 master-0 kubenswrapper[29252]: I1203 20:30:08.146489 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_208195c1-0fbf-4721-96bb-fcd9e1c0bc8f/alertmanager/0.log" Dec 03 20:30:08.162168 master-0 kubenswrapper[29252]: I1203 20:30:08.162100 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_208195c1-0fbf-4721-96bb-fcd9e1c0bc8f/config-reloader/0.log" Dec 03 20:30:08.180501 master-0 kubenswrapper[29252]: I1203 20:30:08.180455 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_208195c1-0fbf-4721-96bb-fcd9e1c0bc8f/kube-rbac-proxy-web/0.log" Dec 03 20:30:08.199146 master-0 kubenswrapper[29252]: I1203 20:30:08.199071 29252 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_208195c1-0fbf-4721-96bb-fcd9e1c0bc8f/kube-rbac-proxy/0.log" Dec 03 20:30:08.212753 master-0 kubenswrapper[29252]: I1203 20:30:08.212682 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_208195c1-0fbf-4721-96bb-fcd9e1c0bc8f/kube-rbac-proxy-metric/0.log" Dec 03 20:30:08.229501 master-0 kubenswrapper[29252]: I1203 20:30:08.229431 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_208195c1-0fbf-4721-96bb-fcd9e1c0bc8f/prom-label-proxy/0.log" Dec 03 20:30:08.245540 master-0 kubenswrapper[29252]: I1203 20:30:08.245473 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_208195c1-0fbf-4721-96bb-fcd9e1c0bc8f/init-config-reloader/0.log" Dec 03 20:30:08.297246 master-0 kubenswrapper[29252]: I1203 20:30:08.297174 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-69cc794c58-dhgcv_ba68608f-6b36-455e-b80b-d19237df9312/cluster-monitoring-operator/0.log" Dec 03 20:30:08.317805 master-0 kubenswrapper[29252]: I1203 20:30:08.317721 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7dcc7f9bd6-wj8nr_28e4a5ec-9304-475d-8321-13b21985d688/kube-state-metrics/0.log" Dec 03 20:30:08.332092 master-0 kubenswrapper[29252]: I1203 20:30:08.332036 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7dcc7f9bd6-wj8nr_28e4a5ec-9304-475d-8321-13b21985d688/kube-rbac-proxy-main/0.log" Dec 03 20:30:08.346677 master-0 kubenswrapper[29252]: I1203 20:30:08.346614 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7dcc7f9bd6-wj8nr_28e4a5ec-9304-475d-8321-13b21985d688/kube-rbac-proxy-self/0.log" Dec 03 20:30:08.373588 master-0 
kubenswrapper[29252]: I1203 20:30:08.373517 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6b8786b56c-g7dqt_9c6bc36a-ed58-4b4d-b602-14ff2d86e266/metrics-server/0.log" Dec 03 20:30:08.400614 master-0 kubenswrapper[29252]: I1203 20:30:08.400456 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-78df8f7475-2lnwf_5a1955e8-1b5a-40b5-b251-d22d715f0e0b/monitoring-plugin/0.log" Dec 03 20:30:08.427639 master-0 kubenswrapper[29252]: I1203 20:30:08.427588 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-x2wbs_89f08828-d22f-48a0-b247-fbe323742568/node-exporter/0.log" Dec 03 20:30:08.442526 master-0 kubenswrapper[29252]: I1203 20:30:08.442463 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-x2wbs_89f08828-d22f-48a0-b247-fbe323742568/kube-rbac-proxy/0.log" Dec 03 20:30:08.458497 master-0 kubenswrapper[29252]: I1203 20:30:08.458438 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-x2wbs_89f08828-d22f-48a0-b247-fbe323742568/init-textfile/0.log" Dec 03 20:30:08.476079 master-0 kubenswrapper[29252]: I1203 20:30:08.476023 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-57cbc648f8-4j26k_70f550ce-35e6-482b-a7ff-4a8c11569406/kube-rbac-proxy-main/0.log" Dec 03 20:30:08.487600 master-0 kubenswrapper[29252]: I1203 20:30:08.487453 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-57cbc648f8-4j26k_70f550ce-35e6-482b-a7ff-4a8c11569406/kube-rbac-proxy-self/0.log" Dec 03 20:30:08.501325 master-0 kubenswrapper[29252]: I1203 20:30:08.501269 29252 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-57cbc648f8-4j26k_70f550ce-35e6-482b-a7ff-4a8c11569406/openshift-state-metrics/0.log" Dec 03 20:30:08.534728 master-0 kubenswrapper[29252]: I1203 20:30:08.534669 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e8c938b9-2779-4966-bdcb-3dfad66828e2/prometheus/0.log" Dec 03 20:30:08.546389 master-0 kubenswrapper[29252]: I1203 20:30:08.546339 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e8c938b9-2779-4966-bdcb-3dfad66828e2/config-reloader/0.log" Dec 03 20:30:08.559672 master-0 kubenswrapper[29252]: I1203 20:30:08.559610 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e8c938b9-2779-4966-bdcb-3dfad66828e2/thanos-sidecar/0.log" Dec 03 20:30:08.574316 master-0 kubenswrapper[29252]: I1203 20:30:08.574260 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e8c938b9-2779-4966-bdcb-3dfad66828e2/kube-rbac-proxy-web/0.log" Dec 03 20:30:08.592628 master-0 kubenswrapper[29252]: I1203 20:30:08.592574 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e8c938b9-2779-4966-bdcb-3dfad66828e2/kube-rbac-proxy/0.log" Dec 03 20:30:08.610039 master-0 kubenswrapper[29252]: I1203 20:30:08.610002 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e8c938b9-2779-4966-bdcb-3dfad66828e2/kube-rbac-proxy-thanos/0.log" Dec 03 20:30:08.628648 master-0 kubenswrapper[29252]: I1203 20:30:08.628597 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e8c938b9-2779-4966-bdcb-3dfad66828e2/init-config-reloader/0.log" Dec 03 20:30:08.650789 master-0 kubenswrapper[29252]: I1203 20:30:08.650671 29252 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-565bdcb8-mrkck_49913de2-24ef-452c-b82a-1f613baa7438/prometheus-operator/0.log" Dec 03 20:30:08.663520 master-0 kubenswrapper[29252]: I1203 20:30:08.663479 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-565bdcb8-mrkck_49913de2-24ef-452c-b82a-1f613baa7438/kube-rbac-proxy/0.log" Dec 03 20:30:08.680561 master-0 kubenswrapper[29252]: I1203 20:30:08.680489 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-6d4cbfb4b-vcqfx_8d86deda-4cd7-4ed5-a703-31f644e2947d/prometheus-operator-admission-webhook/0.log" Dec 03 20:30:08.700742 master-0 kubenswrapper[29252]: I1203 20:30:08.700693 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5c6d5cb75d-gcw4h_00d60e6a-6ad3-4109-bb1e-30e656b91dc9/telemeter-client/0.log" Dec 03 20:30:08.713467 master-0 kubenswrapper[29252]: I1203 20:30:08.713415 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5c6d5cb75d-gcw4h_00d60e6a-6ad3-4109-bb1e-30e656b91dc9/reload/0.log" Dec 03 20:30:08.728501 master-0 kubenswrapper[29252]: I1203 20:30:08.728444 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5c6d5cb75d-gcw4h_00d60e6a-6ad3-4109-bb1e-30e656b91dc9/kube-rbac-proxy/0.log" Dec 03 20:30:08.748752 master-0 kubenswrapper[29252]: I1203 20:30:08.748691 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54c84f8475-9xl5s_922b1203-f140-45bb-94a2-6efb31cf5ee8/thanos-query/0.log" Dec 03 20:30:08.763603 master-0 kubenswrapper[29252]: I1203 20:30:08.763530 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54c84f8475-9xl5s_922b1203-f140-45bb-94a2-6efb31cf5ee8/kube-rbac-proxy-web/0.log" Dec 03 20:30:08.776437 master-0 
kubenswrapper[29252]: I1203 20:30:08.776362 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54c84f8475-9xl5s_922b1203-f140-45bb-94a2-6efb31cf5ee8/kube-rbac-proxy/0.log" Dec 03 20:30:08.792292 master-0 kubenswrapper[29252]: I1203 20:30:08.792235 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54c84f8475-9xl5s_922b1203-f140-45bb-94a2-6efb31cf5ee8/prom-label-proxy/0.log" Dec 03 20:30:08.811536 master-0 kubenswrapper[29252]: I1203 20:30:08.811319 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54c84f8475-9xl5s_922b1203-f140-45bb-94a2-6efb31cf5ee8/kube-rbac-proxy-rules/0.log" Dec 03 20:30:08.822684 master-0 kubenswrapper[29252]: I1203 20:30:08.822636 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54c84f8475-9xl5s_922b1203-f140-45bb-94a2-6efb31cf5ee8/kube-rbac-proxy-metrics/0.log" Dec 03 20:30:10.491175 master-0 kubenswrapper[29252]: I1203 20:30:10.489926 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-96nlf_1ae49184-91db-4355-b553-8cc5506e80bc/controller/0.log" Dec 03 20:30:10.502978 master-0 kubenswrapper[29252]: I1203 20:30:10.502922 29252 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9fh2p"] Dec 03 20:30:10.503764 master-0 kubenswrapper[29252]: E1203 20:30:10.503345 29252 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f7b1203-3453-45f5-b1cb-7cadfa86e314" containerName="collect-profiles" Dec 03 20:30:10.503764 master-0 kubenswrapper[29252]: I1203 20:30:10.503362 29252 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f7b1203-3453-45f5-b1cb-7cadfa86e314" containerName="collect-profiles" Dec 03 20:30:10.503764 master-0 kubenswrapper[29252]: I1203 20:30:10.503586 29252 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0f7b1203-3453-45f5-b1cb-7cadfa86e314" containerName="collect-profiles" Dec 03 20:30:10.504966 master-0 kubenswrapper[29252]: I1203 20:30:10.504942 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9fh2p" Dec 03 20:30:10.515557 master-0 kubenswrapper[29252]: I1203 20:30:10.515506 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-96nlf_1ae49184-91db-4355-b553-8cc5506e80bc/kube-rbac-proxy/0.log" Dec 03 20:30:10.522088 master-0 kubenswrapper[29252]: I1203 20:30:10.521636 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9fh2p"] Dec 03 20:30:10.549822 master-0 kubenswrapper[29252]: I1203 20:30:10.549755 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-c7bbm_4977c492-5b52-447d-ab42-4a70601a0da4/controller/0.log" Dec 03 20:30:10.686931 master-0 kubenswrapper[29252]: I1203 20:30:10.686721 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xt6j\" (UniqueName: \"kubernetes.io/projected/f258f758-29c1-479d-b450-d0179ea56182-kube-api-access-8xt6j\") pod \"community-operators-9fh2p\" (UID: \"f258f758-29c1-479d-b450-d0179ea56182\") " pod="openshift-marketplace/community-operators-9fh2p" Dec 03 20:30:10.686931 master-0 kubenswrapper[29252]: I1203 20:30:10.686908 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f258f758-29c1-479d-b450-d0179ea56182-utilities\") pod \"community-operators-9fh2p\" (UID: \"f258f758-29c1-479d-b450-d0179ea56182\") " pod="openshift-marketplace/community-operators-9fh2p" Dec 03 20:30:10.687244 master-0 kubenswrapper[29252]: I1203 20:30:10.686959 29252 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/f258f758-29c1-479d-b450-d0179ea56182-catalog-content\") pod \"community-operators-9fh2p\" (UID: \"f258f758-29c1-479d-b450-d0179ea56182\") " pod="openshift-marketplace/community-operators-9fh2p" Dec 03 20:30:10.759385 master-0 kubenswrapper[29252]: I1203 20:30:10.758688 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-c7bbm_4977c492-5b52-447d-ab42-4a70601a0da4/frr/0.log" Dec 03 20:30:10.773036 master-0 kubenswrapper[29252]: I1203 20:30:10.772994 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-c7bbm_4977c492-5b52-447d-ab42-4a70601a0da4/reloader/0.log" Dec 03 20:30:10.786361 master-0 kubenswrapper[29252]: I1203 20:30:10.786308 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-c7bbm_4977c492-5b52-447d-ab42-4a70601a0da4/frr-metrics/0.log" Dec 03 20:30:10.788220 master-0 kubenswrapper[29252]: I1203 20:30:10.788173 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xt6j\" (UniqueName: \"kubernetes.io/projected/f258f758-29c1-479d-b450-d0179ea56182-kube-api-access-8xt6j\") pod \"community-operators-9fh2p\" (UID: \"f258f758-29c1-479d-b450-d0179ea56182\") " pod="openshift-marketplace/community-operators-9fh2p" Dec 03 20:30:10.788220 master-0 kubenswrapper[29252]: I1203 20:30:10.788217 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f258f758-29c1-479d-b450-d0179ea56182-utilities\") pod \"community-operators-9fh2p\" (UID: \"f258f758-29c1-479d-b450-d0179ea56182\") " pod="openshift-marketplace/community-operators-9fh2p" Dec 03 20:30:10.788355 master-0 kubenswrapper[29252]: I1203 20:30:10.788266 29252 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f258f758-29c1-479d-b450-d0179ea56182-catalog-content\") pod 
\"community-operators-9fh2p\" (UID: \"f258f758-29c1-479d-b450-d0179ea56182\") " pod="openshift-marketplace/community-operators-9fh2p"
Dec 03 20:30:10.789131 master-0 kubenswrapper[29252]: I1203 20:30:10.789014 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f258f758-29c1-479d-b450-d0179ea56182-utilities\") pod \"community-operators-9fh2p\" (UID: \"f258f758-29c1-479d-b450-d0179ea56182\") " pod="openshift-marketplace/community-operators-9fh2p"
Dec 03 20:30:10.789219 master-0 kubenswrapper[29252]: I1203 20:30:10.789077 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f258f758-29c1-479d-b450-d0179ea56182-catalog-content\") pod \"community-operators-9fh2p\" (UID: \"f258f758-29c1-479d-b450-d0179ea56182\") " pod="openshift-marketplace/community-operators-9fh2p"
Dec 03 20:30:10.803727 master-0 kubenswrapper[29252]: I1203 20:30:10.803664 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-c7bbm_4977c492-5b52-447d-ab42-4a70601a0da4/kube-rbac-proxy/0.log"
Dec 03 20:30:10.803996 master-0 kubenswrapper[29252]: I1203 20:30:10.803748 29252 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xt6j\" (UniqueName: \"kubernetes.io/projected/f258f758-29c1-479d-b450-d0179ea56182-kube-api-access-8xt6j\") pod \"community-operators-9fh2p\" (UID: \"f258f758-29c1-479d-b450-d0179ea56182\") " pod="openshift-marketplace/community-operators-9fh2p"
Dec 03 20:30:10.826805 master-0 kubenswrapper[29252]: I1203 20:30:10.824470 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-c7bbm_4977c492-5b52-447d-ab42-4a70601a0da4/kube-rbac-proxy-frr/0.log"
Dec 03 20:30:10.844089 master-0 kubenswrapper[29252]: I1203 20:30:10.844043 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-c7bbm_4977c492-5b52-447d-ab42-4a70601a0da4/cp-frr-files/0.log"
Dec 03 20:30:10.847541 master-0 kubenswrapper[29252]: I1203 20:30:10.847493 29252 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9fh2p"
Dec 03 20:30:10.879983 master-0 kubenswrapper[29252]: I1203 20:30:10.879281 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-c7bbm_4977c492-5b52-447d-ab42-4a70601a0da4/cp-reloader/0.log"
Dec 03 20:30:10.929800 master-0 kubenswrapper[29252]: I1203 20:30:10.929064 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-c7bbm_4977c492-5b52-447d-ab42-4a70601a0da4/cp-metrics/0.log"
Dec 03 20:30:11.020084 master-0 kubenswrapper[29252]: I1203 20:30:11.017845 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-58pgc_2bebbe36-46ba-47e9-b53e-2c83abe9c329/frr-k8s-webhook-server/0.log"
Dec 03 20:30:11.081251 master-0 kubenswrapper[29252]: I1203 20:30:11.081192 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7d9499bd6f-4h4ww_cb6f03f1-adfa-4249-b822-7dd4acf245be/manager/0.log"
Dec 03 20:30:11.100059 master-0 kubenswrapper[29252]: I1203 20:30:11.099992 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-bfbbd6984-sx9nv_5a2dc21d-ada3-4739-9e62-cbdba8e4985a/webhook-server/0.log"
Dec 03 20:30:11.206170 master-0 kubenswrapper[29252]: I1203 20:30:11.205937 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-j26tj_52511992-a397-485f-b709-f81257ee8e16/speaker/0.log"
Dec 03 20:30:11.221180 master-0 kubenswrapper[29252]: I1203 20:30:11.220723 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-j26tj_52511992-a397-485f-b709-f81257ee8e16/kube-rbac-proxy/0.log"
Dec 03 20:30:11.381868 master-0 kubenswrapper[29252]: I1203 20:30:11.381813 29252 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9fh2p"]
Dec 03 20:30:12.004639 master-0 kubenswrapper[29252]: I1203 20:30:12.004584 29252 generic.go:334] "Generic (PLEG): container finished" podID="f258f758-29c1-479d-b450-d0179ea56182" containerID="c1e76f7247f78efd9e9e49f317a2661651562f9ec31c0e099db4024245b5d012" exitCode=0
Dec 03 20:30:12.005262 master-0 kubenswrapper[29252]: I1203 20:30:12.004652 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fh2p" event={"ID":"f258f758-29c1-479d-b450-d0179ea56182","Type":"ContainerDied","Data":"c1e76f7247f78efd9e9e49f317a2661651562f9ec31c0e099db4024245b5d012"}
Dec 03 20:30:12.005262 master-0 kubenswrapper[29252]: I1203 20:30:12.004684 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fh2p" event={"ID":"f258f758-29c1-479d-b450-d0179ea56182","Type":"ContainerStarted","Data":"0c1fa4d31311c1239ef164441a0d3ff4c346b204bf5d8379aa8b1b08408adbc4"}
Dec 03 20:30:13.026303 master-0 kubenswrapper[29252]: I1203 20:30:13.026237 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fh2p" event={"ID":"f258f758-29c1-479d-b450-d0179ea56182","Type":"ContainerStarted","Data":"4de04bddf4b3a7d40ae0a2bdc0e59e88ef1677a51c0ce6e93b780df4891f25ea"}
Dec 03 20:30:13.148611 master-0 kubenswrapper[29252]: I1203 20:30:13.148559 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-bbd9b9dff-vqzdb_7ed25861-1328-45e7-922e-37588a0b019c/cluster-node-tuning-operator/1.log"
Dec 03 20:30:13.148857 master-0 kubenswrapper[29252]: I1203 20:30:13.148722 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-bbd9b9dff-vqzdb_7ed25861-1328-45e7-922e-37588a0b019c/cluster-node-tuning-operator/0.log"
Dec 03 20:30:13.170444 master-0 kubenswrapper[29252]: I1203 20:30:13.170404 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-l789w_d7171597-cb9a-451c-80a4-64cfccf885f0/tuned/0.log"
Dec 03 20:30:14.039065 master-0 kubenswrapper[29252]: I1203 20:30:14.039009 29252 generic.go:334] "Generic (PLEG): container finished" podID="f258f758-29c1-479d-b450-d0179ea56182" containerID="4de04bddf4b3a7d40ae0a2bdc0e59e88ef1677a51c0ce6e93b780df4891f25ea" exitCode=0
Dec 03 20:30:14.039065 master-0 kubenswrapper[29252]: I1203 20:30:14.039066 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fh2p" event={"ID":"f258f758-29c1-479d-b450-d0179ea56182","Type":"ContainerDied","Data":"4de04bddf4b3a7d40ae0a2bdc0e59e88ef1677a51c0ce6e93b780df4891f25ea"}
Dec 03 20:30:14.748553 master-0 kubenswrapper[29252]: I1203 20:30:14.748485 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/4.log"
Dec 03 20:30:14.832866 master-0 kubenswrapper[29252]: I1203 20:30:14.832731 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-9t9fn_943feb0d-7d31-446a-9100-dfc4ef013d12/kube-apiserver-operator/5.log"
Dec 03 20:30:15.596911 master-0 kubenswrapper[29252]: I1203 20:30:15.596859 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_186cc14f-5f58-43ca-8ffa-db07606ff0f7/installer/0.log"
Dec 03 20:30:15.620032 master-0 kubenswrapper[29252]: I1203 20:30:15.619971 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-retry-1-master-0_e73e6013-87fc-40e2-a573-39930828faa7/installer/0.log"
Dec 03 20:30:15.639526 master-0 kubenswrapper[29252]: I1203 20:30:15.639482 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-3-master-0_bfb85302-c965-417f-8c35-9aff2e464281/installer/0.log"
Dec 03 20:30:15.662686 master-0 kubenswrapper[29252]: I1203 20:30:15.662621 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_108176a9-101d-4204-8ed3-4ed41ccdaae0/installer/0.log"
Dec 03 20:30:15.885185 master-0 kubenswrapper[29252]: I1203 20:30:15.885137 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_f5aa2d6b41f5e21a89224256dc48af14/kube-apiserver/0.log"
Dec 03 20:30:15.896750 master-0 kubenswrapper[29252]: I1203 20:30:15.896701 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_f5aa2d6b41f5e21a89224256dc48af14/kube-apiserver-cert-syncer/0.log"
Dec 03 20:30:15.912694 master-0 kubenswrapper[29252]: I1203 20:30:15.912621 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_f5aa2d6b41f5e21a89224256dc48af14/kube-apiserver-cert-regeneration-controller/0.log"
Dec 03 20:30:15.942572 master-0 kubenswrapper[29252]: I1203 20:30:15.942048 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_f5aa2d6b41f5e21a89224256dc48af14/kube-apiserver-insecure-readyz/0.log"
Dec 03 20:30:15.966991 master-0 kubenswrapper[29252]: I1203 20:30:15.966480 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_f5aa2d6b41f5e21a89224256dc48af14/kube-apiserver-check-endpoints/0.log"
Dec 03 20:30:15.986635 master-0 kubenswrapper[29252]: I1203 20:30:15.986587 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_f5aa2d6b41f5e21a89224256dc48af14/setup/0.log"
Dec 03 20:30:16.059587 master-0 kubenswrapper[29252]: I1203 20:30:16.059530 29252 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fh2p" event={"ID":"f258f758-29c1-479d-b450-d0179ea56182","Type":"ContainerStarted","Data":"f5b9ad43c0d55cd2fde3b5a0ae68e28832c86a416987e041cc167ae4bda489e8"}
Dec 03 20:30:16.104756 master-0 kubenswrapper[29252]: I1203 20:30:16.104669 29252 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9fh2p" podStartSLOduration=3.055247611 podStartE2EDuration="6.104641943s" podCreationTimestamp="2025-12-03 20:30:10 +0000 UTC" firstStartedPulling="2025-12-03 20:30:12.006203585 +0000 UTC m=+1246.819748538" lastFinishedPulling="2025-12-03 20:30:15.055597917 +0000 UTC m=+1249.869142870" observedRunningTime="2025-12-03 20:30:16.092709196 +0000 UTC m=+1250.906254169" watchObservedRunningTime="2025-12-03 20:30:16.104641943 +0000 UTC m=+1250.918186896"
Dec 03 20:30:16.749991 master-0 kubenswrapper[29252]: I1203 20:30:16.749944 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-754cfd84-xfv5j_1f82c7a1-ec21-497d-86f2-562cafa7ace7/kube-rbac-proxy/0.log"
Dec 03 20:30:16.766866 master-0 kubenswrapper[29252]: I1203 20:30:16.766648 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-754cfd84-xfv5j_1f82c7a1-ec21-497d-86f2-562cafa7ace7/manager/1.log"
Dec 03 20:30:16.767971 master-0 kubenswrapper[29252]: I1203 20:30:16.767946 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-754cfd84-xfv5j_1f82c7a1-ec21-497d-86f2-562cafa7ace7/manager/0.log"
Dec 03 20:30:17.388287 master-0 kubenswrapper[29252]: I1203 20:30:17.388235 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-fzh89_78275772-3b78-4283-8b15-f28695b4a15f/cert-manager-controller/0.log"
Dec 03 20:30:17.412660 master-0 kubenswrapper[29252]: I1203 20:30:17.412611 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-fdb6t_f07a0e83-7142-455c-bbbd-5b7b10b03bc0/cert-manager-cainjector/0.log"
Dec 03 20:30:17.432555 master-0 kubenswrapper[29252]: I1203 20:30:17.432506 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-dccxw_ad5369d7-39be-4259-a3b3-c67744ea990a/cert-manager-webhook/0.log"
Dec 03 20:30:17.994662 master-0 kubenswrapper[29252]: I1203 20:30:17.994599 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5_26c59077-74ee-4b0e-bda7-06c2b0a2cae4/extract/0.log"
Dec 03 20:30:18.009752 master-0 kubenswrapper[29252]: I1203 20:30:18.009681 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5_26c59077-74ee-4b0e-bda7-06c2b0a2cae4/util/0.log"
Dec 03 20:30:18.024972 master-0 kubenswrapper[29252]: I1203 20:30:18.024899 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864xp9w5_26c59077-74ee-4b0e-bda7-06c2b0a2cae4/pull/0.log"
Dec 03 20:30:18.043112 master-0 kubenswrapper[29252]: I1203 20:30:18.043060 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5cd89994b5-78ft8_b6ca362c-809a-47d1-8b68-9848967d382a/manager/0.log"
Dec 03 20:30:18.055295 master-0 kubenswrapper[29252]: I1203 20:30:18.055215 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5cd89994b5-78ft8_b6ca362c-809a-47d1-8b68-9848967d382a/kube-rbac-proxy/0.log"
Dec 03 20:30:18.075892 master-0 kubenswrapper[29252]: I1203 20:30:18.075833 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-f8856dd79-fsg7r_4bfaaa2e-15b3-40fb-93c2-994c4a38559d/manager/0.log"
Dec 03 20:30:18.087030 master-0 kubenswrapper[29252]: I1203 20:30:18.086991 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-f8856dd79-fsg7r_4bfaaa2e-15b3-40fb-93c2-994c4a38559d/kube-rbac-proxy/0.log"
Dec 03 20:30:18.109022 master-0 kubenswrapper[29252]: I1203 20:30:18.108972 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84bc9f68f5-pnq2w_615341bc-bf59-4b24-9baa-3223edd30ad0/manager/0.log"
Dec 03 20:30:18.118883 master-0 kubenswrapper[29252]: I1203 20:30:18.118840 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84bc9f68f5-pnq2w_615341bc-bf59-4b24-9baa-3223edd30ad0/kube-rbac-proxy/0.log"
Dec 03 20:30:18.141420 master-0 kubenswrapper[29252]: I1203 20:30:18.141370 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-78cd4f7769-7lgz9_c2fed802-28a0-40d3-b422-581c334d8bc5/manager/0.log"
Dec 03 20:30:18.256127 master-0 kubenswrapper[29252]: I1203 20:30:18.256022 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-78cd4f7769-7lgz9_c2fed802-28a0-40d3-b422-581c334d8bc5/kube-rbac-proxy/0.log"
Dec 03 20:30:18.277245 master-0 kubenswrapper[29252]: I1203 20:30:18.277201 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-7fd96594c7-bhdr8_7f965ddc-ff13-4f8e-b20c-aad918a7be33/manager/0.log"
Dec 03 20:30:18.296002 master-0 kubenswrapper[29252]: I1203 20:30:18.295950 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-7fd96594c7-bhdr8_7f965ddc-ff13-4f8e-b20c-aad918a7be33/kube-rbac-proxy/0.log"
Dec 03 20:30:18.314663 master-0 kubenswrapper[29252]: I1203 20:30:18.314612 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-f6cc97788-v6spm_4cf77700-7d9a-4d7e-bf0a-71777fa32e55/manager/0.log"
Dec 03 20:30:18.324629 master-0 kubenswrapper[29252]: I1203 20:30:18.324584 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-f6cc97788-v6spm_4cf77700-7d9a-4d7e-bf0a-71777fa32e55/kube-rbac-proxy/0.log"
Dec 03 20:30:18.342601 master-0 kubenswrapper[29252]: I1203 20:30:18.342557 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d9c9d7fd8-jcxjt_e0c29a23-11dd-445c-8ebf-cef7994d7bc3/manager/0.log"
Dec 03 20:30:18.352633 master-0 kubenswrapper[29252]: I1203 20:30:18.352590 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d9c9d7fd8-jcxjt_e0c29a23-11dd-445c-8ebf-cef7994d7bc3/kube-rbac-proxy/0.log"
Dec 03 20:30:18.373979 master-0 kubenswrapper[29252]: I1203 20:30:18.373923 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7c9bfd6967-kgmrh_7d7bb0ae-4a5d-4196-a340-51fca6907f3a/manager/0.log"
Dec 03 20:30:18.383682 master-0 kubenswrapper[29252]: I1203 20:30:18.383632 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7c9bfd6967-kgmrh_7d7bb0ae-4a5d-4196-a340-51fca6907f3a/kube-rbac-proxy/0.log"
Dec 03 20:30:18.400983 master-0 kubenswrapper[29252]: I1203 20:30:18.400934 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-58b8dcc5fb-t5dvt_28196001-edc5-4152-830f-7712255d742c/manager/0.log"
Dec 03 20:30:18.411324 master-0 kubenswrapper[29252]: I1203 20:30:18.411284 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-58b8dcc5fb-t5dvt_28196001-edc5-4152-830f-7712255d742c/kube-rbac-proxy/0.log"
Dec 03 20:30:18.427913 master-0 kubenswrapper[29252]: I1203 20:30:18.427842 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-56f9fbf74b-q66vk_64a91869-6d5d-4f4a-8f51-9eab613c4b13/manager/0.log"
Dec 03 20:30:18.439432 master-0 kubenswrapper[29252]: I1203 20:30:18.439394 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-56f9fbf74b-q66vk_64a91869-6d5d-4f4a-8f51-9eab613c4b13/kube-rbac-proxy/0.log"
Dec 03 20:30:18.457717 master-0 kubenswrapper[29252]: I1203 20:30:18.457668 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-647d75769b-qw258_12773d43-060c-4f0c-8a6c-615a6f577894/manager/0.log"
Dec 03 20:30:18.468611 master-0 kubenswrapper[29252]: I1203 20:30:18.468568 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-647d75769b-qw258_12773d43-060c-4f0c-8a6c-615a6f577894/kube-rbac-proxy/0.log"
Dec 03 20:30:18.487754 master-0 kubenswrapper[29252]: I1203 20:30:18.487698 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7cdd6b54fb-5fcb6_d3595782-c6a7-4f72-99fb-44f3a68f1f6d/manager/0.log"
Dec 03 20:30:18.499877 master-0 kubenswrapper[29252]: I1203 20:30:18.499766 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7cdd6b54fb-5fcb6_d3595782-c6a7-4f72-99fb-44f3a68f1f6d/kube-rbac-proxy/0.log"
Dec 03 20:30:18.520912 master-0 kubenswrapper[29252]: I1203 20:30:18.520792 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-865fc86d5b-twshg_d27a7a24-0257-494c-9cc6-889c8a971e81/manager/0.log"
Dec 03 20:30:18.537066 master-0 kubenswrapper[29252]: I1203 20:30:18.537027 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-865fc86d5b-twshg_d27a7a24-0257-494c-9cc6-889c8a971e81/kube-rbac-proxy/0.log"
Dec 03 20:30:18.601578 master-0 kubenswrapper[29252]: I1203 20:30:18.601527 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-845b79dc4f-kp2nj_12a31e7e-9fbb-49e1-b779-6050f8898ce3/manager/0.log"
Dec 03 20:30:18.612847 master-0 kubenswrapper[29252]: I1203 20:30:18.612802 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-845b79dc4f-kp2nj_12a31e7e-9fbb-49e1-b779-6050f8898ce3/kube-rbac-proxy/0.log"
Dec 03 20:30:18.631871 master-0 kubenswrapper[29252]: I1203 20:30:18.631749 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp_b08561ad-441a-4ed6-b8d2-4af65531b047/manager/0.log"
Dec 03 20:30:18.645496 master-0 kubenswrapper[29252]: I1203 20:30:18.645462 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cb6d6b947lvjwp_b08561ad-441a-4ed6-b8d2-4af65531b047/kube-rbac-proxy/0.log"
Dec 03 20:30:18.682952 master-0 kubenswrapper[29252]: I1203 20:30:18.682909 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-57d98476c4-g442r_0c987116-b442-4fd5-b528-bb2540c8c37c/manager/0.log"
Dec 03 20:30:18.826584 master-0 kubenswrapper[29252]: I1203 20:30:18.826471 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7b84d49558-q6t9g_385e1bf4-f4ce-4589-92ad-124932b9c490/operator/0.log"
Dec 03 20:30:18.848081 master-0 kubenswrapper[29252]: I1203 20:30:18.848031 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-4sgv6_bf8c7606-3354-4927-931d-a0ca4721acd6/registry-server/0.log"
Dec 03 20:30:18.863270 master-0 kubenswrapper[29252]: I1203 20:30:18.863228 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-647f96877-h76kk_2fe48ec1-85b8-48dc-b0b5-8f3a5dc91dd0/manager/0.log"
Dec 03 20:30:18.875495 master-0 kubenswrapper[29252]: I1203 20:30:18.875457 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-647f96877-h76kk_2fe48ec1-85b8-48dc-b0b5-8f3a5dc91dd0/kube-rbac-proxy/0.log"
Dec 03 20:30:18.895218 master-0 kubenswrapper[29252]: I1203 20:30:18.895175 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-6b64f6f645-vvm54_eb78027d-6293-4a9a-961a-d4b57eb0e5f5/manager/0.log"
Dec 03 20:30:18.906425 master-0 kubenswrapper[29252]: I1203 20:30:18.906379 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-6b64f6f645-vvm54_eb78027d-6293-4a9a-961a-d4b57eb0e5f5/kube-rbac-proxy/0.log"
Dec 03 20:30:18.923796 master-0 kubenswrapper[29252]: I1203 20:30:18.923741 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-78955d896f-nlszs_88b03a41-99d3-4be2-913d-6a6ce4ad4b78/operator/0.log"
Dec 03 20:30:18.941156 master-0 kubenswrapper[29252]: I1203 20:30:18.941103 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-696b999796-z7xsv_90e0e11c-59df-46fd-9d7e-4c77a66cab18/manager/0.log"
Dec 03 20:30:18.952942 master-0 kubenswrapper[29252]: I1203 20:30:18.952891 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-696b999796-z7xsv_90e0e11c-59df-46fd-9d7e-4c77a66cab18/kube-rbac-proxy/0.log"
Dec 03 20:30:18.976215 master-0 kubenswrapper[29252]: I1203 20:30:18.976156 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7b5867bfc7-lf5ts_7363059f-f7ee-4bb1-a028-f021e2e51f8e/manager/0.log"
Dec 03 20:30:18.985164 master-0 kubenswrapper[29252]: I1203 20:30:18.985109 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7b5867bfc7-lf5ts_7363059f-f7ee-4bb1-a028-f021e2e51f8e/kube-rbac-proxy/0.log"
Dec 03 20:30:18.997604 master-0 kubenswrapper[29252]: I1203 20:30:18.997554 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-57dfcdd5b8-tgvnm_7b7362c6-9cc4-45dd-8a04-614481022860/manager/0.log"
Dec 03 20:30:19.027234 master-0 kubenswrapper[29252]: I1203 20:30:19.027169 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-57dfcdd5b8-tgvnm_7b7362c6-9cc4-45dd-8a04-614481022860/kube-rbac-proxy/0.log"
Dec 03 20:30:19.047392 master-0 kubenswrapper[29252]: I1203 20:30:19.047337 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9b669fdb-nbjnc_f3ed1633-722c-4440-95db-2b644be51ba9/manager/0.log"
Dec 03 20:30:19.060097 master-0 kubenswrapper[29252]: I1203 20:30:19.060031 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9b669fdb-nbjnc_f3ed1633-722c-4440-95db-2b644be51ba9/kube-rbac-proxy/0.log"
Dec 03 20:30:20.487027 master-0 kubenswrapper[29252]: I1203 20:30:20.486971 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-9nn22_7010359f-6e7a-41c9-9a49-f29d67babf3c/nmstate-console-plugin/0.log"
Dec 03 20:30:20.503304 master-0 kubenswrapper[29252]: I1203 20:30:20.503241 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-zfhdt_0944c190-d25f-481e-b59a-75869f8dc9e2/nmstate-handler/0.log"
Dec 03 20:30:20.525737 master-0 kubenswrapper[29252]: I1203 20:30:20.525645 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-xmxkb_6980a7eb-a3c8-4496-87aa-b56680009c84/nmstate-metrics/0.log"
Dec 03 20:30:20.537706 master-0 kubenswrapper[29252]: I1203 20:30:20.537665 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-xmxkb_6980a7eb-a3c8-4496-87aa-b56680009c84/kube-rbac-proxy/0.log"
Dec 03 20:30:20.557878 master-0 kubenswrapper[29252]: I1203 20:30:20.557816 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-l9xts_bc12a15e-d84d-430c-a33c-833407ab976d/nmstate-operator/0.log"
Dec 03 20:30:20.572997 master-0 kubenswrapper[29252]: I1203 20:30:20.572953 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-9cg4d_cabd9912-85fb-4fca-a116-1c9bf1ab19e1/nmstate-webhook/0.log"
Dec 03 20:30:20.847996 master-0 kubenswrapper[29252]: I1203 20:30:20.847880 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9fh2p"
Dec 03 20:30:20.848426 master-0 kubenswrapper[29252]: I1203 20:30:20.848371 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9fh2p"
Dec 03 20:30:20.898309 master-0 kubenswrapper[29252]: I1203 20:30:20.898259 29252 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9fh2p"
Dec 03 20:30:21.157322 master-0 kubenswrapper[29252]: I1203 20:30:21.157245 29252 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9fh2p"
Dec 03 20:30:21.224568 master-0 kubenswrapper[29252]: I1203 20:30:21.224520 29252 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9fh2p"]
Dec 03 20:30:21.237231 master-0 kubenswrapper[29252]: I1203 20:30:21.237168 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-8dbtt_907128fd-4fcf-46cc-b294-19424448ccc9/prometheus-operator/0.log"
Dec 03 20:30:21.250633 master-0 kubenswrapper[29252]: I1203 20:30:21.250542 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7f66756b69-22wkc_608d3e20-ad16-4dbd-a829-5bfe8e6f345c/prometheus-operator-admission-webhook/0.log"
Dec 03 20:30:21.270419 master-0 kubenswrapper[29252]: I1203 20:30:21.270383 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7f66756b69-54275_48bd42d2-ab48-4b98-86ce-948b7b70b781/prometheus-operator-admission-webhook/0.log"
Dec 03 20:30:21.294991 master-0 kubenswrapper[29252]: I1203 20:30:21.294942 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-rnk7l_ad1d4a4f-7aab-4033-b132-5f30d7c5b76a/operator/0.log"
Dec 03 20:30:21.317824 master-0 kubenswrapper[29252]: I1203 20:30:21.317740 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-glb28_2e412375-a8c7-4167-adc2-7e8054c3bf4a/perses-operator/0.log"
Dec 03 20:30:21.972023 master-0 kubenswrapper[29252]: I1203 20:30:21.971913 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pwlw2_87f1759a-7df4-442e-a22d-6de8d54be333/kube-multus-additional-cni-plugins/0.log"
Dec 03 20:30:21.988588 master-0 kubenswrapper[29252]: I1203 20:30:21.988521 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pwlw2_87f1759a-7df4-442e-a22d-6de8d54be333/egress-router-binary-copy/0.log"
Dec 03 20:30:22.007483 master-0 kubenswrapper[29252]: I1203 20:30:22.007428 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pwlw2_87f1759a-7df4-442e-a22d-6de8d54be333/cni-plugins/0.log"
Dec 03 20:30:22.026896 master-0 kubenswrapper[29252]: I1203 20:30:22.026819 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pwlw2_87f1759a-7df4-442e-a22d-6de8d54be333/bond-cni-plugin/0.log"
Dec 03 20:30:22.043888 master-0 kubenswrapper[29252]: I1203 20:30:22.043827 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pwlw2_87f1759a-7df4-442e-a22d-6de8d54be333/routeoverride-cni/0.log"
Dec 03 20:30:22.059318 master-0 kubenswrapper[29252]: I1203 20:30:22.059270 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pwlw2_87f1759a-7df4-442e-a22d-6de8d54be333/whereabouts-cni-bincopy/0.log"
Dec 03 20:30:22.075181 master-0 kubenswrapper[29252]: I1203 20:30:22.075126 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pwlw2_87f1759a-7df4-442e-a22d-6de8d54be333/whereabouts-cni/0.log"
Dec 03 20:30:22.096702 master-0 kubenswrapper[29252]: I1203 20:30:22.096659 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-5bdcc987c4-s6wpc_c3afc439-ccaa-4751-95a1-ac7557e326f0/multus-admission-controller/0.log"
Dec 03 20:30:22.109260 master-0 kubenswrapper[29252]: I1203 20:30:22.109209 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-5bdcc987c4-s6wpc_c3afc439-ccaa-4751-95a1-ac7557e326f0/kube-rbac-proxy/0.log"
Dec 03 20:30:22.234691 master-0 kubenswrapper[29252]: I1203 20:30:22.234597 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p9sdj_a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6/kube-multus/0.log"
Dec 03 20:30:22.315894 master-0 kubenswrapper[29252]: I1203 20:30:22.315842 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p9sdj_a9d3cd3e-98b8-4a41-a6bc-8837332fb6a6/kube-multus/1.log"
Dec 03 20:30:22.343748 master-0 kubenswrapper[29252]: I1203 20:30:22.343691 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hs6gf_46b5d4d0-b841-4e87-84b4-85911ff04325/network-metrics-daemon/0.log"
Dec 03 20:30:22.353576 master-0 kubenswrapper[29252]: I1203 20:30:22.353539 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hs6gf_46b5d4d0-b841-4e87-84b4-85911ff04325/kube-rbac-proxy/0.log"
Dec 03 20:30:22.969215 master-0 kubenswrapper[29252]: I1203 20:30:22.969157 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_lvms-operator-7d96b77997-8s46z_0a5a9bd4-822a-4f9b-b3f0-1b689bc35857/manager/0.log"
Dec 03 20:30:22.991203 master-0 kubenswrapper[29252]: I1203 20:30:22.991132 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-k68ws_ad601dfa-8310-41a9-8abd-119fbed1aa01/vg-manager/1.log"
Dec 03 20:30:22.992712 master-0 kubenswrapper[29252]: I1203 20:30:22.992689 29252 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-k68ws_ad601dfa-8310-41a9-8abd-119fbed1aa01/vg-manager/0.log"
Dec 03 20:30:23.126076 master-0 kubenswrapper[29252]: I1203 20:30:23.126004 29252 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9fh2p" podUID="f258f758-29c1-479d-b450-d0179ea56182" containerName="registry-server" containerID="cri-o://f5b9ad43c0d55cd2fde3b5a0ae68e28832c86a416987e041cc167ae4bda489e8" gracePeriod=2
Dec 03 20:30:23.529956 master-0 kubenswrapper[29252]: I1203 20:30:23.529756 29252 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9fh2p"
Dec 03 20:30:23.616470 master-0 kubenswrapper[29252]: I1203 20:30:23.616387 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f258f758-29c1-479d-b450-d0179ea56182-catalog-content\") pod \"f258f758-29c1-479d-b450-d0179ea56182\" (UID: \"f258f758-29c1-479d-b450-d0179ea56182\") "
Dec 03 20:30:23.616771 master-0 kubenswrapper[29252]: I1203 20:30:23.616570 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f258f758-29c1-479d-b450-d0179ea56182-utilities\") pod \"f258f758-29c1-479d-b450-d0179ea56182\" (UID: \"f258f758-29c1-479d-b450-d0179ea56182\") "
Dec 03 20:30:23.616771 master-0 kubenswrapper[29252]: I1203 20:30:23.616619 29252 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xt6j\" (UniqueName: \"kubernetes.io/projected/f258f758-29c1-479d-b450-d0179ea56182-kube-api-access-8xt6j\") pod \"f258f758-29c1-479d-b450-d0179ea56182\" (UID: \"f258f758-29c1-479d-b450-d0179ea56182\") "
Dec 03 20:30:23.618547 master-0 kubenswrapper[29252]: I1203 20:30:23.618460 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f258f758-29c1-479d-b450-d0179ea56182-utilities" (OuterVolumeSpecName: "utilities") pod "f258f758-29c1-479d-b450-d0179ea56182" (UID: "f258f758-29c1-479d-b450-d0179ea56182"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 20:30:23.620358 master-0 kubenswrapper[29252]: I1203 20:30:23.620306 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f258f758-29c1-479d-b450-d0179ea56182-kube-api-access-8xt6j" (OuterVolumeSpecName: "kube-api-access-8xt6j") pod "f258f758-29c1-479d-b450-d0179ea56182" (UID: "f258f758-29c1-479d-b450-d0179ea56182"). InnerVolumeSpecName "kube-api-access-8xt6j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 20:30:23.716652 master-0 kubenswrapper[29252]: I1203 20:30:23.716494 29252 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f258f758-29c1-479d-b450-d0179ea56182-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f258f758-29c1-479d-b450-d0179ea56182" (UID: "f258f758-29c1-479d-b450-d0179ea56182"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 20:30:23.720010 master-0 kubenswrapper[29252]: I1203 20:30:23.719946 29252 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f258f758-29c1-479d-b450-d0179ea56182-catalog-content\") on node \"master-0\" DevicePath \"\""
Dec 03 20:30:23.720010 master-0 kubenswrapper[29252]: I1203 20:30:23.719997 29252 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f258f758-29c1-479d-b450-d0179ea56182-utilities\") on node \"master-0\" DevicePath \"\""
Dec 03 20:30:23.720010 master-0 kubenswrapper[29252]: I1203 20:30:23.720011 29252 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xt6j\" (UniqueName: \"kubernetes.io/projected/f258f758-29c1-479d-b450-d0179ea56182-kube-api-access-8xt6j\") on node \"master-0\" DevicePath \"\""